{"id":6956,"date":"2026-05-12T18:15:07","date_gmt":"2026-05-12T15:15:07","guid":{"rendered":"https:\/\/nhrclb.org\/?p=6956"},"modified":"2026-05-12T18:24:14","modified_gmt":"2026-05-12T15:24:14","slug":"protecting-children-in-the-digital-environment%e2%94%82social-media-restrictions-platform-accountability-and-human-rights-implications-for-lebanon","status":"publish","type":"post","link":"https:\/\/nhrclb.org\/en\/archives\/6956","title":{"rendered":"Protecting Children in the Digital Environment: Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon"},"content":{"rendered":"<h4>Author: Bassam Alkantar<\/h4>\n<div style=\"position: relative; width: 100%; height: 0; padding-bottom: 75%; overflow: hidden;\"><iframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%; border: 0;\" src=\"https:\/\/nhrclb.org\/wp-content\/uploads\/2026\/05\/Protecting-Children-in-the-Digital-Environment-English.pdf\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/div>\n<p style=\"text-align: center; margin-top: 10px;\"><a href=\"https:\/\/nhrclb.org\/wp-content\/uploads\/2026\/05\/Protecting-Children-in-the-Digital-Environment-English.pdf\" target=\"_blank\" rel=\"noopener\">Open or Download the PDF File<\/a><\/p>\n<h2><strong>1. Executive Summary<\/strong><\/h2>\n<p>Digital technologies have become an integral part of the lives of children and adolescents in Lebanon and around the world. Social media platforms now function as key spaces for communication, education, identity formation, entertainment, and civic participation. At the same time, these technologies expose children to serious risks, including cyberbullying, online harassment, harmful content, exploitative data practices, addictive design features, online grooming, and emerging harms linked to artificial intelligence. 
These developments have prompted governments to adopt or consider new forms of regulation. Yet the central legal and policy challenge remains how to protect children effectively without undermining their rights to freedom of expression, access to information, privacy, education, and participation.<\/p>\n<p>This report, issued by the National Human Rights Commission, which includes the Committee for the Prevention of Torture, examines this challenge from a human rights perspective, with a particular focus on Lebanon. It analyzes international regulatory trends, the relevant international legal framework, recent Lebanese legislative and policy developments, and the growing importance of cybercrime regulation and artificial intelligence governance. It concludes that while the protection of children in the digital environment is a legitimate and pressing objective, restrictive approaches based primarily on blanket access bans are unlikely to provide a complete or sustainable solution. A more effective approach requires a comprehensive and rights-based framework combining child protection, platform accountability, data protection, cybercrime cooperation, transparency, and regulation of artificial intelligence.<\/p>\n<p>The report first situates Lebanon within a broader international policy debate. In Australia, legislation adopted in December 2025 requires social media companies to prevent individuals under sixteen from opening or maintaining accounts. In Malaysia, similar restrictions have been proposed under the Online Safety Act 2025, potentially through digital identity verification systems. By contrast, the European Union has largely favored platform accountability measures through instruments such as the Digital Services Act and the General Data Protection Regulation, which impose obligations on technology companies to mitigate systemic risks, protect user data, and restrict targeted advertising to minors. 
These comparative examples show that states are pursuing divergent regulatory paths, some centered on access restrictions, others on the regulation of platform business models and digital ecosystems.<\/p>\n<p>The report highlights the concerns raised by civil society organizations and human rights defenders, including Amnesty International, regarding age-based bans on children\u2019s access to social media. Such restrictions may be circumvented in practice, including through false age declarations or alternative accounts, which may push children into less visible and potentially less safe digital spaces. Measures relying on biometric data, facial recognition, or document-based age verification may also create significant risks to privacy and data protection. More fundamentally, access restrictions may interfere with children\u2019s rights to receive and impart information, engage in public debate, and participate in social and cultural life. The report therefore stresses that effective protection cannot be limited to restricting access, but must address the design, operation, and accountability of digital platforms themselves.<\/p>\n<p>International human rights law provides the governing framework for assessing these issues. Lebanon is bound by the Convention on the Rights of the Child and the International Covenant on Civil and Political Rights. Under these instruments, children are rights-holders entitled not only to protection from harm, but also to privacy, participation, freedom of expression, and access to information. General Comment No. 25 of the UN Committee on the Rights of the Child makes clear that states must ensure children can safely benefit from digital technologies while also protecting them from abuse and exploitation. 
Regulatory measures must therefore satisfy the principles of legality, necessity, proportionality, and respect for the evolving capacities of the child.<\/p>\n<p>A key finding of the report is that many digital harms affecting children arise not solely from children\u2019s presence online, but from the structure and incentives of the digital environment itself. Social media platforms rely heavily on algorithmic recommendation systems, surveillance-based advertising, extensive profiling, and interface designs intended to maximize engagement and prolong attention. These features can amplify harmful, sensational, or misleading content and expose children to patterns of dependency, manipulation, or exploitation. As a result, policies that focus exclusively on restricting children\u2019s access risk overlooking the deeper structural drivers of harm. The report therefore supports regulation that includes safety-by-design, stronger protections for children\u2019s personal data, algorithmic transparency, and corporate accountability.<\/p>\n<p>Within Lebanon, the report identifies important but still incomplete efforts to respond to these challenges. On 26 February 2026, the Council of Ministers adopted Decision No. 13, creating an inter-ministerial committee tasked with preparing a national strategy to regulate and guide children\u2019s use of the internet and digital applications. This initiative acknowledges that existing Lebanese laws, including the Penal Code, Law No. 422\/2002, Law No. 293\/2014, and Law No. 81\/2018 on Electronic Transactions and Personal Data, provide only partial protection in the digital sphere and do not amount to a comprehensive framework. 
The inclusion of the President of the National Human Rights Commission in this committee is a significant step, as it opens the possibility for independent human rights oversight in the development of digital policy.<\/p>\n<p>The report also examines the draft law submitted on 5 February 2026 by MP Tony Frangieh, which would prohibit children under fourteen from using social media platforms and require providers to implement age verification measures. The proposal also includes safeguards relating to children\u2019s data and sanctions for non-compliant platforms. While the draft law is motivated by genuine concerns regarding cyberbullying, harmful content, exploitation, and mental health, the report finds that it raises serious human rights questions. These include whether a complete prohibition is proportionate, whether less restrictive alternatives have been sufficiently explored, and whether the proposed age assurance systems can be implemented in a genuinely privacy-respecting manner. The report concludes that the draft law should not be assessed in isolation, but within a broader framework that also addresses platform design, education, prevention, remedy, and accountability.<\/p>\n<p>The role of the NHRC-CPT is central throughout this process. In line with General Comment No. 25, national human rights institutions have an important function in monitoring children\u2019s rights in the digital environment, assessing proposed legislation, raising awareness, and promoting compliance with international standards. 
For Lebanon, this means that the NHRC-CPT should be recognized not only as a participant in institutional consultations, but as an independent actor capable of reviewing draft laws, documenting harms, receiving complaints, and advocating for a child-sensitive and rights-based digital governance framework.<\/p>\n<p>The report further argues that digital safety for children cannot be achieved without a stronger legal framework for cybercrime and digital evidence. The transnational nature of online harms means that exploitation, harassment, cyberstalking, online grooming, and the dissemination of harmful content frequently involve actors, evidence, and infrastructures located across borders. In this context, the adoption by the UN General Assembly of the United Nations Convention against Cybercrime on 24 December 2024 represents an important development. The Convention establishes a global legal framework for harmonizing cybercrime offences, facilitating digital investigations, and enabling cross-border cooperation in obtaining electronic evidence, while requiring respect for privacy, freedom of expression, and due process.<\/p>\n<p>Lebanon has not yet acceded to this Convention. The report finds that accession would strengthen Lebanon\u2019s capacity to investigate and prosecute cybercrime, especially in cases involving children and cross-border digital infrastructures. It would also help modernize national law in relation to digital evidence, mutual legal assistance, and procedural safeguards. At present, Law No. 81\/2018 remains Lebanon\u2019s principal digital law, but it does not offer a sufficiently comprehensive framework for cybercrime cooperation, modern data protection oversight, or digital justice in line with recent international standards. 
The report therefore recommends that cybercrime regulation be integrated into child protection policy rather than treated as a separate field.<\/p>\n<p>An additional dimension addressed in the report is the protection of women and girls from technology-facilitated violence. The draft law on the protection of women from digital violence, submitted to Parliament on 25 February 2026, represents a significant legislative initiative to criminalize cyberstalking, online harassment, identity theft, electronic extortion, and the non-consensual dissemination of intimate images. Although primarily focused on women, many of its provisions also have relevance for children and adolescents, especially girls, who face similar forms of abuse in digital spaces. The report therefore considers this initiative an important part of the broader effort to create a safer and more accountable digital environment in Lebanon.<\/p>\n<p>The report also addresses the emerging challenge of artificial intelligence regulation. Lebanon has recently begun considering several legislative and institutional initiatives in this field. One proposal introduced in 2026 seeks to criminalize the creation, modification, or use of intimate or indecent images and videos generated or altered by artificial intelligence without consent. This draft law responds to the growing threat posed by deepfakes and synthetic media, which can inflict serious reputational, psychological, and social harm, especially on women, children, and other vulnerable groups. The report welcomes this development while emphasizing the importance of legal clarity, proportionality, and safeguards against misuse.<\/p>\n<p>At the institutional level, two separate governance models have been proposed. 
The first is the draft law establishing the Ministry of Information Technology and Artificial Intelligence, approved by the Council of Ministers in September 2025, which would centralize responsibility for national digital transformation, cybersecurity, personal data protection, and AI policy. The second is the draft law submitted on 4 June 2025 to create a National Artificial Intelligence Authority as an independent body responsible for preparing the national AI strategy, proposing regulatory frameworks, monitoring implementation, and supervising the ethical use of artificial intelligence. These proposals reflect growing recognition that artificial intelligence requires dedicated governance. However, the report finds that the current institutional landscape remains fragmented and lacks a single coherent, rights-based architecture.<\/p>\n<p>Civil society concerns reinforce this assessment. Organizations such as SMEX have raised important questions regarding digital sovereignty, transparency, public-private technology partnerships, and the risks of over-reliance on foreign technology providers. Concerns surrounding the Oracle training agreement announced in December 2025, the incomplete implementation of Law No. 81\/2018, and the absence of strong independent oversight mechanisms all illustrate the vulnerabilities of Lebanon\u2019s current digital governance framework. Without stronger safeguards, there is a risk that digital transformation may proceed faster than the legal and institutional protections needed to preserve privacy, accountability, and public trust.<\/p>\n<p>Against this background, the report concludes that Lebanon is at a decisive regulatory moment. The country has begun to recognize the need to protect children online, strengthen cybercrime responses, address digital violence, and regulate artificial intelligence. 
However, current measures remain dispersed across multiple draft laws, ministerial initiatives, and policy proposals that have not yet been consolidated into a coherent framework. The report therefore calls for a comprehensive national strategy that brings together children\u2019s rights, data protection, cybercrime regulation, platform accountability, digital literacy, AI governance, and institutional oversight under a single human rights-based vision.<\/p>\n<p>Finally, the report sets out recommendations and expected outcomes directed to the Lebanese Government, Parliament, ministries and public authorities, civil society organizations, and UN agencies and treaty bodies. These recommendations aim to support the development of a coordinated framework that protects children and other vulnerable groups, strengthens accountability for digital platforms and AI systems, improves transparency in digital governance, and aligns Lebanon\u2019s laws and institutions with international human rights standards. At the center of this framework, the report places the NHRC-CPT as an independent institution with a critical role in monitoring, advocacy, oversight, and public guidance.<\/p>\n<p>The report\u2019s central message is that children should not be treated merely as passive subjects of protection in the digital age. They are rights-holders entitled to safety, dignity, privacy, participation, freedom of expression, and access to information. Protecting them requires not only restrictions where justified, but also education, accountability, transparency, remedy, and institutional reform. A sustainable digital governance framework for Lebanon must therefore place human rights, democratic oversight, and the best interests of the child at its core.<\/p>\n<h2><strong>2. Introduction<\/strong><\/h2>\n<p>Digital technologies have become deeply embedded in the everyday lives of children and adolescents. 
Social media platforms, messaging applications, video-sharing services, online gaming environments, and emerging artificial intelligence tools now shape how young people communicate, learn, access information, express themselves, and participate in social and political life. For many children, the digital environment is no longer separate from offline reality, but an essential extension of it. It offers significant opportunities for education, creativity, community-building, and civic engagement. At the same time, it exposes children to growing risks, including cyberbullying, harassment, harmful content, exploitation of personal data, online grooming, non-consensual image-sharing, addictive design practices, and new forms of manipulation enabled by algorithmic systems and synthetic media.<\/p>\n<p>These developments have generated increasing concern among governments, parents, educators, civil society organizations, and international human rights bodies. In response, a number of states have begun to explore legal and policy measures aimed at regulating children\u2019s access to digital platforms, especially social media. Yet such measures raise complex legal and ethical questions. Efforts to protect children online must not come at the expense of their rights to freedom of expression, privacy, access to information, education, and participation. International human rights law requires that any restrictions be lawful, necessary, proportionate, and consistent with the evolving capacities and best interests of the child.<\/p>\n<p>In Lebanon, these debates have become increasingly urgent. Recent legislative proposals, policy initiatives, and institutional discussions reflect growing recognition that the country lacks a coherent framework for governing children\u2019s digital lives, cybercrime, digital violence, and the use of artificial intelligence. This report examines these developments through a human rights lens. 
It argues that protecting children in the digital environment requires more than restrictive access measures. It requires a comprehensive, rights-based framework that addresses platform accountability, data protection, cybercrime cooperation, digital literacy, online gender-based violence, and the governance of artificial intelligence in a manner consistent with Lebanon\u2019s domestic and international legal obligations.<\/p>\n<h2><strong>3. Global Policy Responses to Children\u2019s Social Media Use<\/strong><\/h2>\n<h3>3.1. Australia<\/h3>\n<p>In December 2025, Australia adopted legislation requiring social media companies to prevent individuals under sixteen from opening or maintaining social media accounts. The law obliges platforms to implement age verification mechanisms and remove existing accounts belonging to minors.<a href=\"#_ftn1\" name=\"_ftnref1\"><sup>[1]<\/sup><\/a><\/p>\n<p>The Australian government justified the legislation as a measure intended to protect children from harmful online content, excessive screen time, and the psychological risks associated with prolonged exposure to social media.<\/p>\n<p>However, the legislation has generated significant debate among policymakers and civil society organizations. 
Survey data suggests that although a majority of Australians support the intention behind the ban, many believe that children will find ways to circumvent the restrictions and that enforcement may prove difficult.<a href=\"#_ftn2\" name=\"_ftnref2\"><sup>[2]<\/sup><\/a><\/p>\n<h3>3.2. Malaysia<\/h3>\n<p>In Malaysia, authorities have proposed similar restrictions under the Online Safety Act 2025, potentially requiring digital identity verification mechanisms for online users. Civil society groups have raised concerns that such measures could introduce significant risks to privacy and freedom of expression.<a href=\"#_ftn3\" name=\"_ftnref3\"><sup>[3]<\/sup><\/a><\/p>\n<h3>3.3. European Union<\/h3>\n<p>Within the European Union, policymakers have generally favored regulatory approaches focusing on platform accountability rather than direct bans on children\u2019s access to social media. Instruments such as the Digital Services Act and the General Data Protection Regulation impose obligations on technology companies to mitigate systemic risks, protect user data, and limit targeted advertising directed at minors.<a href=\"#_ftn4\" name=\"_ftnref4\"><sup>[4]<\/sup><\/a><\/p>\n<h2><strong>4. Human Rights Concerns Raised by Civil Society<\/strong><\/h2>\n<p>The rapid expansion of digital technologies and social media platforms has generated an equally rapid increase in public debate regarding their impact on children\u2019s well-being, development, and rights. 
Governments across the world have increasingly explored legislative measures aimed at restricting children\u2019s access to social media platforms or imposing obligations on technology companies to mitigate digital harms. While many of these initiatives are motivated by legitimate concerns regarding children\u2019s safety online, civil society organizations, digital rights advocates, child protection experts, and human rights institutions have raised important concerns regarding the potential unintended consequences of such measures.<\/p>\n<p>These concerns do not challenge the objective of protecting children in digital environments. Rather, they emphasize that regulatory responses must remain consistent with international human rights standards and must address the root causes of digital harm rather than focusing solely on restricting children\u2019s access to technology. Civil society organizations have therefore called for balanced regulatory approaches that combine child protection measures with safeguards for freedom of expression, privacy, access to information, and participation in digital spaces.<\/p>\n<h3>4.1. Concerns regarding blanket restrictions on children\u2019s access to social media<\/h3>\n<p>One of the most widely debated policy responses to concerns about children\u2019s online safety has been the proposal to prohibit or restrict children\u2019s access to social media platforms below a certain age. Proposals of this kind have emerged in several jurisdictions, including Australia, the United Kingdom, and a number of other countries considering legislation addressing children\u2019s online safety.<\/p>\n<p>Civil society organizations have raised concerns that blanket prohibitions on children\u2019s access to social media may prove ineffective in practice. 
Research on digital behavior indicates that children and adolescents often possess significant digital literacy and technical adaptability. As a result, they may easily circumvent age-based restrictions by providing false age information, using alternative accounts, accessing platforms through shared devices, or migrating to less regulated digital spaces.<\/p>\n<p>Such outcomes may undermine the intended protective purpose of the restrictions. Rather than reducing children\u2019s exposure to harmful online environments, restrictive policies may push children toward platforms that operate with fewer safeguards, weaker moderation systems, or less oversight. In some cases, children may also seek access to online communities that are more difficult for parents, educators, or authorities to monitor, thereby potentially increasing exposure to risk.<\/p>\n<p>Civil society groups have therefore emphasized that policies focusing exclusively on restricting children\u2019s access to platforms may not effectively address the underlying causes of digital harm. Instead, they argue that regulation should focus on the design, operation, and accountability of digital platforms themselves.<\/p>\n<h3>4.2. Privacy risks associated with age verification mechanisms<\/h3>\n<p>Another major concern raised by civil society organizations relates to the age verification mechanisms that are often required to enforce restrictions on children\u2019s access to social media. In many legislative proposals, digital platforms are required to verify users\u2019 age in order to determine whether individuals are eligible to create or maintain accounts.<\/p>\n<p>Age verification systems may involve several technological approaches. 
These include requiring users to upload government-issued identity documents, relying on facial recognition or biometric verification technologies, using artificial intelligence to estimate age based on facial images, or linking online accounts to digital identity systems. While such systems may help enforce age-based restrictions, they may also introduce new risks related to privacy and data protection.<\/p>\n<p>Human rights organizations and digital rights groups have warned that many age verification systems require the collection and processing of highly sensitive personal data. These may include biometric identifiers, facial images, identity documents, or other personal information that can be used to verify identity. If such data is stored, processed, or shared by private companies without adequate safeguards, it may expose children and their families to significant privacy risks.<\/p>\n<p>The potential for data breaches, unauthorized data sharing, profiling, or misuse of personal information has raised concerns among civil society actors. In addition, the use of biometric technologies for age verification may contribute to the normalization of large-scale digital identity verification systems that could have broader implications for privacy and freedom of expression online.<\/p>\n<p>Civil society organizations therefore emphasize that any age verification systems introduced to protect children must adhere to strict principles of data minimization, purpose limitation, and privacy by design. 
They must also be subject to independent oversight and transparent regulatory safeguards to ensure that children\u2019s personal data is not exploited or exposed to misuse.<\/p>\n<h3>4.3. Risks to freedom of expression and access to information<\/h3>\n<p>Civil society organizations have also emphasized that restrictions on children\u2019s access to digital platforms may have implications for their rights to freedom of expression and access to information. These rights are protected under international human rights instruments, including the Convention on the Rights of the Child and the International Covenant on Civil and Political Rights.<\/p>\n<p>In the contemporary digital environment, social media platforms serve as major channels through which individuals exchange ideas, access news and educational resources, and participate in public debate. Young people frequently rely on these platforms to access information about social issues, educational opportunities, and community initiatives. They also use digital platforms to express opinions, share experiences, and participate in discussions affecting their communities.<\/p>\n<p>Civil society organizations therefore caution that overly restrictive measures may limit children\u2019s ability to engage with information and public discourse. While protection from harmful content remains a legitimate concern, regulatory approaches must ensure that children retain meaningful opportunities to exercise their rights to communication, participation, and expression.<\/p>\n<p>These concerns are particularly relevant in societies where traditional media environments may not fully reflect the diversity of youth perspectives or where digital platforms provide important spaces for civic engagement and community-building. 
Limiting children\u2019s access to digital spaces without providing alternative channels for participation may inadvertently silence their voices in public debate.<\/p>\n<h3>4.4. Structural drivers of digital harm<\/h3>\n<p>In recent years, civil society organizations and academic researchers have increasingly emphasized that many harms experienced by children online arise from the structural design of digital platforms rather than simply from children\u2019s presence online. Social media platforms are typically built around business models that rely on maximizing user engagement and collecting large volumes of personal data.<\/p>\n<p>Algorithmic recommendation systems are central to these models. These systems analyze user behavior in order to recommend content that is most likely to attract attention and encourage continued engagement. While such systems may enhance user experience, they may also amplify sensational, emotionally provocative, or controversial content because such material often generates higher engagement.<\/p>\n<p>For children and adolescents, this dynamic may result in prolonged exposure to harmful or misleading content. Recommendation algorithms may also reinforce echo chambers or expose young users to material that may negatively affect their psychological well-being.<\/p>\n<p>In addition to algorithmic amplification, the data-driven advertising models used by many social media platforms rely heavily on the collection and profiling of user data. These systems track user behavior across platforms, build detailed profiles of individual users, and use these profiles to deliver targeted advertising.<\/p>\n<p>Civil society organizations have raised concerns that such practices may be particularly problematic when applied to children. 
Young users may not fully understand the implications of data collection or targeted advertising practices. As a result, children may be exposed to manipulative commercial messaging or behavioral targeting that exploits their vulnerabilities.<\/p>\n<p>From this perspective, civil society organizations argue that regulatory responses should focus not only on children\u2019s access to platforms but also on the broader structural features of digital ecosystems. Measures such as restrictions on targeted advertising to minors, transparency requirements for algorithmic systems, and obligations for platforms to conduct risk assessments may therefore play a critical role in protecting children online.<\/p>\n<h3>4.5. Digital literacy and empowerment<\/h3>\n<p>Civil society actors also emphasize the importance of education and digital literacy in protecting children from online harm. Rather than relying exclusively on restrictive regulatory measures, many organizations advocate for strategies that empower children, parents, and educators to navigate digital environments safely and responsibly.<\/p>\n<p>Digital literacy initiatives may include educational programs addressing online safety, privacy protection, responsible communication, and critical evaluation of digital content. These initiatives can help children develop the skills necessary to recognize harmful content, avoid online exploitation, and respond appropriately to cyberbullying or harassment.<\/p>\n<p>Parents and educators also play an important role in guiding children\u2019s digital experiences. 
Civil society organizations therefore encourage the development of support resources and awareness programs that assist families in understanding digital technologies and promoting healthy digital habits.<\/p>\n<p>Such initiatives recognize that children are not merely passive recipients of digital content but active participants in digital environments. Empowering children to understand and navigate these environments may therefore provide more sustainable protection than restrictive measures alone.<\/p>\n<p>&nbsp;<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 4.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Toward a balanced regulatory approach<\/h3>\n<p>The concerns raised by civil society organizations do not imply opposition to the regulation of digital platforms or the adoption of policies aimed at protecting children online. On the contrary, many organizations strongly support the development of comprehensive regulatory frameworks addressing digital harms.<\/p>\n<p>However, civil society actors consistently emphasize that effective regulation must balance protective objectives with respect for human rights. Regulatory frameworks should therefore include measures addressing platform accountability, data protection, algorithmic transparency, and corporate responsibility while preserving children\u2019s rights to expression, information, and participation.<\/p>\n<p>Such an approach recognizes that digital technologies present both opportunities and risks for children. Protecting children in the digital age therefore requires policies that combine safeguards against harm with measures enabling children to benefit from digital innovation in safe and empowering ways.<\/p>\n<p>In this context, the role of human rights institutions, policymakers, educators, civil society organizations, and technology companies becomes essential. 
Only through coordinated efforts among these actors can regulatory frameworks be developed that effectively protect children while upholding the fundamental rights and freedoms that underpin democratic societies.<\/p>\n<h2><strong>\u00a0\u00a0 5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><strong>International Human Rights Framework<\/strong><\/h2>\n<p>&nbsp;<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 5.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Convention on the Rights of the Child<\/h3>\n<p>Lebanon ratified the Convention on the Rights of the Child in 1991. The Convention establishes several rights relevant to children\u2019s participation in digital environments.<\/p>\n<p>Article 13 guarantees children the right to freedom of expression, including the freedom to seek, receive, and impart information through any media.<\/p>\n<p>Article 17 recognizes the importance of ensuring children\u2019s access to information that contributes to their development.<\/p>\n<p>Article 16 protects children\u2019s right to privacy.<\/p>\n<p>States are required to ensure that protective measures adopted in the digital environment respect the principle of proportionality and the evolving capacities of the child. <a href=\"#_ftn5\" name=\"_ftnref5\"><sup>[5]<\/sup><\/a><\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 5.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 International Covenant on Civil and Political Rights<\/h3>\n<p>Lebanon is also a party to the International Covenant on Civil and Political Rights. Article 19 protects freedom of expression and access to information.<\/p>\n<p>Restrictions on these rights must satisfy the criteria of legality, necessity, and proportionality under international law. 
<a href=\"#_ftn6\" name=\"_ftnref6\"><sup>[6]<\/sup><\/a><\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 5.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 UN Committee on the Rights of the Child<\/h3>\n<p>In General Comment No. 25 on children\u2019s rights in the digital environment, the Committee emphasized that states must ensure children can safely benefit from digital technologies while protecting them from online harms.<a href=\"#_ftn7\" name=\"_ftnref7\"><sup>[7]<\/sup><\/a><\/p>\n<h2><strong>\u00a0\u00a0 6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><strong>Structural Drivers of Digital Harm<\/strong><\/h2>\n<p>Growing research in the fields of digital governance, technology policy, and human rights demonstrates that many online harms experienced by children and adolescents do not arise solely from their presence on digital platforms. Instead, these harms are often deeply connected to the underlying economic incentives, technological architecture, and design features of the platforms themselves. Social media companies operate within business models that prioritize the capture of user attention, the collection of personal data, and the monetization of behavioral information through targeted advertising systems. These structural characteristics shape how information circulates online and how users, including children, interact with digital environments.<\/p>\n<p>Amnesty International\u2019s report <em>Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights<\/em> (2019) provides a critical analysis of these dynamics. The report argues that the dominant business model of major technology platforms relies on what has been described as \u201csurveillance capitalism,\u201d a system in which companies systematically collect, analyze, and monetize vast quantities of personal data in order to predict and influence user behavior. 
This model incentivizes platform designs that maximize user engagement and data extraction rather than prioritizing user well-being, safety, or the protection of fundamental rights. <a href=\"#_ftn8\" name=\"_ftnref8\"><sup>[8]<\/sup><\/a><\/p>\n<p>Within this framework, several structural drivers of digital harm can be identified. These include algorithmic recommendation systems designed to maximize engagement, surveillance-based advertising models relying on extensive personal data collection and profiling, and interface design mechanisms engineered to prolong user attention and encourage repeated interaction. Together, these systems create digital environments in which children may be exposed to amplified risks, including harmful content, misinformation, online harassment, and patterns of excessive or compulsive digital use.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Algorithmic recommendation systems and the amplification of harmful content<\/h3>\n<p>One of the most influential structural features of contemporary social media platforms is the use of algorithmic recommendation systems. These systems determine which content appears in users\u2019 feeds, which videos are suggested next, and which posts are most prominently displayed. Rather than presenting information chronologically or randomly, platforms rely on complex algorithms that analyze user behavior in order to predict which content is most likely to capture attention and encourage continued engagement.<\/p>\n<p>These algorithms rely on a wide range of behavioral signals, including the amount of time users spend viewing certain content, the posts they like or share, the accounts they follow, and their patterns of interaction with other users. 
By analyzing these signals, algorithmic systems attempt to identify content that is likely to generate strong emotional reactions or prolonged attention.<\/p>\n<p>While such systems can enhance user experience by recommending content that aligns with user interests, they also create powerful incentives for the amplification of sensational or emotionally charged material. Research indicates that content provoking anger, outrage, fear, or controversy often generates higher levels of engagement. As a result, algorithmic recommendation systems may disproportionately promote content that is polarizing, misleading, or harmful.<\/p>\n<p>For children and adolescents, the implications of this dynamic can be significant. Young users may encounter content that promotes unrealistic body standards, self-harm behaviors, harmful challenges, disinformation, or extreme viewpoints. Once users interact with such content, algorithms may continue recommending similar material, reinforcing exposure through a process often described as algorithmic \u201crabbit holes.\u201d<\/p>\n<p>Amnesty International and other human rights organizations have argued that these algorithmic amplification mechanisms are not accidental but are closely tied to the economic incentives of the platform business model. Since advertising revenue is directly linked to the amount of time users spend on platforms, companies have strong incentives to design systems that maximize engagement even when such engagement may expose users to harmful content.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Surveillance-based advertising and data extraction<\/h3>\n<p>Another key structural driver of digital harm is the surveillance-based advertising model that underpins the business strategies of many major technology platforms. 
Under this model, companies collect extensive information about users\u2019 online behavior, interests, relationships, and preferences. This information is then used to construct detailed behavioral profiles that enable highly targeted advertising.<\/p>\n<p>Amnesty International\u2019s analysis describes this system as one in which users are continuously monitored across digital environments. Data may be collected not only from activity on a single platform but also from interactions across multiple websites and applications. Through tracking technologies such as cookies, device identifiers, and embedded software development kits, platforms can build detailed records of users\u2019 browsing habits, search queries, location data, and interactions with online content.<\/p>\n<p>For children, such data collection raises serious concerns regarding privacy and autonomy. Young users may not fully understand how their data is collected or how it is used for commercial purposes. They may also be particularly vulnerable to targeted advertising practices that exploit emotional or developmental vulnerabilities.<\/p>\n<p>Targeted advertising systems may promote products, services, or content that aligns with the behavioral profiles generated by data analysis. For example, users who interact with certain types of content may receive advertisements related to dieting products, cosmetic procedures, or other potentially harmful messaging. In some cases, such advertisements may reinforce harmful stereotypes or encourage behaviors that negatively affect children\u2019s well-being.<\/p>\n<p>The surveillance-based advertising model also creates incentives for platforms to collect as much personal data as possible. The more data a company gathers, the more accurately it can predict user behavior and deliver targeted advertisements. 
As Amnesty International argues, this dynamic can create systemic pressures to expand data collection practices, often in ways that undermine privacy rights and limit individuals\u2019 control over their personal information.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Interface design and attention capture<\/h3>\n<p>In addition to algorithmic recommendation systems and surveillance-based advertising models, many social media platforms employ interface design strategies aimed at maximizing user engagement. These design features are often informed by behavioral psychology and are specifically intended to encourage users to remain on platforms for extended periods of time.<\/p>\n<p>Examples of such features include infinite scrolling mechanisms, autoplay functions for video content, push notifications that alert users to new interactions, and visual indicators such as \u201clikes\u201d or engagement counters that reinforce social validation. These design elements can create feedback loops that encourage repeated interaction and prolonged use.<\/p>\n<p>For children and adolescents, these design mechanisms may contribute to patterns of excessive screen time or compulsive engagement with digital platforms. Young users may experience pressure to remain constantly connected in order to maintain social relationships, respond to messages, or monitor online interactions.<\/p>\n<p>Psychological research suggests that features such as intermittent rewards and social validation cues can activate behavioral responses similar to those observed in gambling environments. Notifications, likes, and comments provide small bursts of social feedback that encourage users to check platforms repeatedly.<\/p>\n<p>Amnesty International and other organizations have argued that such design practices raise ethical questions when applied to young users. 
If platforms are intentionally designed to capture and retain attention, children may find it difficult to disengage from digital environments even when prolonged use negatively affects sleep patterns, academic performance, or mental health.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.4.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Exposure to misinformation and harmful narratives<\/h3>\n<p>Another consequence of engagement-driven platform design is the rapid spread of misinformation and harmful narratives. Algorithmic systems designed to prioritize engagement may inadvertently promote misleading or inaccurate information if such content generates high levels of user interaction.<\/p>\n<p>In the context of children\u2019s digital experiences, misinformation may include false health advice, conspiracy theories, or distorted representations of social issues. Young users who lack the skills or experience to critically evaluate online information may be particularly vulnerable to such content.<\/p>\n<p>Moreover, algorithmic systems may amplify communities or networks that promote harmful behaviors, including harassment campaigns, extremist narratives, or discriminatory ideologies. While platforms have introduced moderation policies aimed at reducing harmful content, enforcement challenges remain significant given the scale and speed at which information circulates online.<\/p>\n<p>Civil society organizations have therefore emphasized the importance of transparency and accountability in the design and operation of algorithmic systems. 
Without greater transparency regarding how recommendation algorithms function, it remains difficult for regulators, researchers, and the public to assess their impact on user well-being and democratic discourse.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Implications for children\u2019s rights<\/h3>\n<p>The structural drivers of digital harm described above have important implications for the protection of children\u2019s rights in the digital environment. The Convention on the Rights of the Child recognizes that children are entitled not only to protection from harm but also to privacy, freedom of expression, access to information, and participation in cultural and social life.<\/p>\n<p>When digital platforms are designed in ways that prioritize data extraction and attention capture, these rights may be affected. Extensive data collection practices may undermine children\u2019s right to privacy. Algorithmic amplification of harmful content may expose children to material that affects their well-being or development. Engagement-driven design features may contribute to patterns of digital dependency that affect mental health.<\/p>\n<p>From a human rights perspective, addressing these challenges requires more than restricting children\u2019s access to digital technologies. It requires examining the structural incentives that shape how platforms operate and how digital environments are designed.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 6.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Toward structural regulation of digital platforms<\/h3>\n<p>Recognizing the structural drivers of digital harm has led many policymakers and human rights advocates to call for regulatory approaches that address the responsibilities of technology companies themselves. 
Such approaches may include requirements for platforms to conduct human rights impact assessments of their technologies, transparency obligations regarding algorithmic systems, restrictions on targeted advertising directed at minors, and stronger protections for children\u2019s personal data.<\/p>\n<p>Regulators may also consider age-appropriate design standards requiring platforms to prioritize safety and well-being in products used by children. These standards could include limits on addictive design features, clearer privacy protections, and enhanced reporting mechanisms for harmful content.<\/p>\n<p>Ultimately, protecting children in digital environments requires a shift in regulatory focus. Instead of treating children\u2019s presence online as the primary source of risk, policymakers must recognize that many harms arise from the structure and incentives of the digital ecosystem itself. Addressing these structural drivers is therefore essential for creating digital environments that respect children\u2019s rights, support their development, and ensure that technological innovation advances in ways that are consistent with human dignity and human rights.<\/p>\n<h2><strong>\u00a0\u00a0 7.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><strong>Legal Framework in Lebanon<\/strong><\/h2>\n<p>Recent developments in Lebanon demonstrate growing recognition of the need to address risks associated with children\u2019s use of digital technologies.<\/p>\n<p>While several laws provide partial protections relevant to children in digital environments, Lebanon currently lacks a comprehensive regulatory framework addressing children\u2019s access to social media platforms.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 7.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Cabinet Initiative to Develop a National Strategy<\/h3>\n<p>The issue of children\u2019s access to the internet and social media platforms 
in Lebanon has recently been addressed through emerging legislative and policy initiatives. On 26 February 2026, the Lebanese Council of Ministers adopted Decision No. 13 (Minutes No. 52), approving the formation of a joint ministerial committee tasked with preparing a comprehensive national strategy to regulate and guide the use of the internet and digital applications by children. The decision was adopted following a proposal submitted by the Ministry of Information and is grounded in several existing legal instruments, including the Lebanese Penal Code (Legislative Decree No. 340\/1943), the Law on the Protection of Juveniles in Conflict with the Law or at Risk (Law No. 422\/2002), the Law on the Protection of Women and Other Family Members from Domestic Violence (Law No. 293\/2014), and the Law on Electronic Transactions and Personal Data (Law No. 81\/2018). The Cabinet decision recognizes that, while these laws provide partial protection in the digital sphere, Lebanon lacks a comprehensive policy or regulatory framework addressing the risks associated with children\u2019s use of the internet and social media. It therefore mandates the establishment of an inter-ministerial committee composed of the Ministers of Information, Justice, Telecommunications, Social Affairs, Technology and Artificial Intelligence, Education and Higher Education, Interior and Municipalities, and Environment, in addition to the head of the national team responsible for combating cybercrime and the President of the National Human Rights Commission. The committee is tasked with developing a national plan to guide and regulate children\u2019s use of internet platforms and applications, in coordination with relevant public institutions, civil society actors, and international organizations such as UNICEF. 
This initiative is framed within Lebanon\u2019s obligations under international human rights law, particularly the Convention on the Rights of the Child, which Lebanon ratified in 1991 and which requires the state to take appropriate measures to ensure the protection and best interests of the child, including in the digital environment.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 7.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Draft Law on Prohibiting the Use of Social Media by Minors<\/h3>\n<p>The draft law proposed by Member of Parliament Tony Frangieh on 25 February 2026 seeks to establish a legal framework regulating children\u2019s access to social media platforms in Lebanon. The proposal defines a minor as any person under the age of fourteen and prohibits social media platforms from creating or activating accounts for individuals below this age threshold. It requires service providers operating within Lebanon to adopt effective and privacy-respecting age verification mechanisms, including parental verification systems, digital age verification tools, or artificial intelligence-based technologies to ensure compliance with the minimum age requirement. The draft law also introduces safeguards aimed at protecting minors\u2019 personal data, prohibiting platforms from collecting, exploiting, or selling children\u2019s data for commercial or media purposes. Enforcement provisions include criminal sanctions for non-compliant platforms, ranging from three months to two years of imprisonment and fines between five and twenty times the official minimum wage, in addition to the possibility for the Ministry of Telecommunications to suspend platform operations partially or entirely in cases of repeated violations.
The proposal provides limited exceptions for electronic learning platforms and educational communication applications, and grants service providers a three-month period following the publication of the law in the Official Gazette to comply with its provisions. Implementation measures are to be developed jointly by the Ministries of Telecommunications and Social Affairs. The accompanying explanatory memorandum highlights the growing risks associated with early exposure to social media, including cyberbullying, exposure to harmful content, online exploitation, and negative impacts on children\u2019s mental health, while emphasizing Lebanon\u2019s obligations under international law, particularly the Convention on the Rights of the Child, to ensure the protection of children in the digital environment.<\/p>\n<p>&nbsp;<\/p>\n<h2><strong>\u00a0\u00a0 8.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><strong>Assessment of the Lebanese Proposal in Light of International Human Rights Standards<\/strong><\/h2>\n<p>The draft law proposed in Lebanon to prohibit social media use by minors under the age of fourteen reflects legitimate concerns regarding the potential risks associated with children\u2019s exposure to digital platforms. These risks include cyberbullying, exposure to harmful content, online exploitation, and psychological harms associated with excessive digital engagement.
Nevertheless, legislative initiatives regulating children\u2019s access to digital technologies must be assessed within the framework of international human rights law, particularly the standards articulated by the United Nations Committee on the Rights of the Child and other international bodies addressing children\u2019s rights in the digital environment.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Compatibility with the Convention on the Rights of the Child<\/h3>\n<p>The Convention on the Rights of the Child requires States Parties to ensure that children\u2019s rights are respected, protected, and fulfilled in all environments, including the digital sphere. The UN Committee on the Rights of the Child clarified these obligations in General Comment No. 25 on children\u2019s rights in relation to the digital environment, adopted in 2021. The Committee emphasized that the digital environment has become central to the realization of children\u2019s rights and that technological innovations affect children\u2019s civil, political, economic, social, and cultural rights in interconnected ways. <a href=\"#_ftn9\" name=\"_ftnref9\"><sup>[9]<\/sup><\/a><\/p>\n<p>General Comment No. 25 affirms that meaningful access to digital technologies can support children in exercising a wide range of rights, including the rights to education, access to information, freedom of expression, cultural participation, and social development. At the same time, the Committee recognizes that digital environments can expose children to significant risks, including exploitation, privacy violations, harmful content, and abusive conduct.<\/p>\n<p>Consequently, the Committee calls on states to adopt regulatory frameworks that both protect children from harm and enable them to benefit from digital technologies. 
Importantly, the Committee stresses that protective measures must respect the principles of proportionality, necessity, and the evolving capacities of the child. Blanket restrictions that entirely exclude children from digital environments may therefore raise concerns if they disproportionately limit children\u2019s rights to information, participation, and expression.<\/p>\n<p>In the Lebanese context, the proposed age-based prohibition raises questions regarding whether a complete restriction on access to social media platforms for children under fourteen constitutes a proportionate response to the identified risks. While the protection of children is a legitimate objective under international law, states must demonstrate that restrictive measures are necessary and that less restrictive alternatives would not achieve the same protective objective.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Age Verification and Privacy Concerns<\/h3>\n<p>The draft law requires digital platforms to implement age verification mechanisms in order to prevent minors from creating accounts. While age assurance technologies may contribute to the protection of children online, international human rights bodies have highlighted the potential privacy risks associated with such mechanisms.<\/p>\n<p>General Comment No. 25 stresses that the collection and processing of children\u2019s personal data in digital environments must respect the highest standards of privacy protection. States are required to ensure that digital service providers adopt privacy-by-design and data minimization principles when processing children\u2019s data.<\/p>\n<p>Many age verification systems rely on the submission of sensitive personal data, including government-issued identification documents, facial recognition technologies, or biometric verification tools. 
These mechanisms may create new risks for children if personal data is stored or processed by private companies with inadequate safeguards.<\/p>\n<p>Human rights organizations have therefore cautioned that age verification systems must be carefully designed to avoid unnecessary collection of personal data and to ensure compliance with international privacy standards. From a human rights perspective, age assurance measures should prioritize privacy-preserving technologies and independent oversight mechanisms to prevent misuse of children\u2019s personal information.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Structural Drivers of Harm in Digital Platforms<\/h3>\n<p>Another key issue concerns the underlying causes of many digital harms experienced by children. Research and policy analysis conducted by human rights organizations, including Amnesty International\u2019s technology and human rights programme, indicate that many risks associated with social media use are linked to the structural design of digital platforms rather than simply to children\u2019s presence online.<\/p>\n<p>Many social media platforms rely on surveillance-based business models that depend on extensive data collection and algorithmic recommendation systems designed to maximize user engagement. These systems can amplify sensational or emotionally provocative content, which may include harmful or misleading material. As a result, users, including children, may be exposed to content that undermines their well-being or psychological health.<\/p>\n<p>From this perspective, regulatory approaches focusing solely on restricting access for young users may fail to address the structural drivers of digital harm. 
International human rights experts increasingly argue that regulatory frameworks should focus on ensuring that digital platforms operate in ways that respect human rights, including children\u2019s rights.<\/p>\n<p>Such frameworks may include requirements for algorithmic transparency, limits on targeted advertising directed at minors, and obligations for platforms to conduct human rights impact assessments of their technologies.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.4.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Children\u2019s Rights to Participation and Access to Information<\/h3>\n<p>International human rights law recognizes that children are active participants in social and civic life rather than passive recipients of protection. The Convention on the Rights of the Child affirms children\u2019s rights to express their views, access information, and participate in cultural and social activities.<\/p>\n<p>In the digital era, social media platforms often serve as important spaces where young people engage in political debate, share experiences, and access educational resources. General Comment No. 25 therefore emphasizes that states must ensure that children can meaningfully participate in digital environments while being protected from harm.<\/p>\n<p>Blanket restrictions on children\u2019s access to social media platforms may inadvertently limit opportunities for participation and engagement. For example, young people frequently use digital platforms to access information about public affairs, participate in social movements, and express views on issues affecting their communities.<\/p>\n<p>Regulatory approaches should therefore aim to strike a balance between protection and participation. 
Rather than excluding children from digital spaces entirely, policymakers may consider measures that promote safer digital environments while preserving opportunities for children to exercise their rights.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Role of National Human Rights Institutions<\/h3>\n<p>General Comment No. 25 highlights the role of national human rights institutions in monitoring and protecting children\u2019s rights in the digital environment. The Committee recommends that national human rights institutions be empowered to receive complaints from children, investigate digital rights violations, and provide oversight regarding the implementation of policies affecting children online.<\/p>\n<p>In Lebanon, the involvement of the National Human Rights Commission in the inter-ministerial committee established by the Council of Ministers represents an important step toward ensuring that children\u2019s digital rights are considered within a broader human rights framework.<\/p>\n<p>National human rights institutions can play a critical role in assessing the human rights implications of proposed legislation, promoting public awareness of children\u2019s digital rights, and ensuring that regulatory frameworks are consistent with international standards.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 8.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Toward a Balanced Regulatory Approach<\/h3>\n<p>The Lebanese draft law reflects legitimate concerns regarding children\u2019s safety in digital environments. 
However, international human rights standards suggest that effective regulation should address both the risks associated with children\u2019s digital engagement and the responsibilities of technology companies.<\/p>\n<p>A balanced regulatory framework may therefore combine several elements, including age-appropriate design standards for digital platforms, stronger protections for children\u2019s personal data, algorithmic accountability mechanisms, and digital literacy initiatives aimed at empowering children and parents to navigate digital environments safely.<\/p>\n<p>Such an approach aligns with international human rights standards and recognizes that protecting children in the digital age requires cooperation among governments, technology companies, educators, and civil society organizations.<\/p>\n<p>&nbsp;<\/p>\n<h2><strong>\u00a0\u00a0 9.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/strong><strong>Strengthening the Legal Framework for Digital Safety and Cybercrime<\/strong><\/h2>\n<p>The effective protection of children in digital environments requires not only policies addressing access to social media platforms, but also a comprehensive legal framework capable of responding to the broader challenges posed by cybercrime, digital evidence, and cross-border digital investigations. The global nature of online platforms means that many digital harms affecting children, including cyberbullying, online exploitation, harassment, and the dissemination of harmful content, frequently involve actors, platforms, and infrastructure located across multiple jurisdictions. As a result, national legal frameworks must increasingly operate within broader systems of international cooperation.<\/p>\n<p>In this context, recent developments in international law have strengthened the regulatory architecture governing cybercrime and digital evidence. 
The adoption of the United Nations Convention against Cybercrime represents a significant step toward establishing a global legal framework for addressing crimes committed through information and communications technologies while ensuring that investigative measures respect human rights and fundamental freedoms.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 The United Nations Convention against Cybercrime<\/h3>\n<p>On 24 December 2024, the United Nations General Assembly adopted the United Nations Convention against Cybercrime through Resolution 79\/243. The Convention represents the first comprehensive global treaty specifically aimed at addressing crimes committed through information and communications technology systems. The adoption of the Convention reflects growing international recognition that cybercrime constitutes a major transnational challenge requiring coordinated international responses.<\/p>\n<p>The Convention establishes a legal framework designed to strengthen the capacity of States to prevent, investigate, and prosecute cybercrime while facilitating international cooperation in the collection and exchange of electronic evidence. It seeks to harmonize national criminal legislation relating to cybercrime offences, develop investigative tools adapted to digital environments, and create mechanisms for cross-border cooperation among law enforcement authorities.<\/p>\n<p>The treaty also recognizes that the investigation of cybercrime frequently requires access to digital evidence located outside the territorial jurisdiction of the investigating state. 
As a result, the Convention provides procedures aimed at facilitating international cooperation, including mutual legal assistance, information sharing, and mechanisms for obtaining electronic evidence stored abroad.<\/p>\n<p>At the same time, the Convention emphasizes the importance of ensuring that investigative powers in the digital sphere are exercised in a manner consistent with international human rights law. It therefore includes provisions requiring States Parties to ensure that measures adopted to combat cybercrime respect fundamental rights and freedoms, including the right to privacy, freedom of expression, and due process guarantees.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Entry into Force and Signature Process<\/h3>\n<p>In accordance with Article 64 of the Convention, the treaty was opened for signature in Hanoi on 25 and 26 October 2025 and subsequently at United Nations Headquarters in New York until 31 December 2026. Article 65 further provides that the Convention will enter into force ninety days after the deposit of the fortieth instrument of ratification, acceptance, or accession.<\/p>\n<p>Once in force, the Convention is expected to serve as a key international instrument governing cooperation in cybercrime investigations and the exchange of electronic evidence between states. 
It aims to strengthen international legal cooperation, develop procedural safeguards for digital investigations, and promote respect for human rights in the context of combating cybercrime.<\/p>\n<p>For states facing increasing challenges related to digital crime, participation in the Convention may significantly enhance the ability of national authorities to investigate offences involving cross-border digital infrastructures.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Relevance of the Convention for the Protection of Children<\/h3>\n<p>The growing use of digital platforms by children has been accompanied by an increase in online harms affecting minors. These harms include cyberbullying, harassment, the distribution of harmful or exploitative content, online grooming, and other forms of digital abuse. Many of these offences involve actors operating across multiple jurisdictions or using digital infrastructure hosted in different countries.<\/p>\n<p>International cooperation mechanisms therefore play a critical role in enabling national authorities to investigate and prosecute such crimes effectively. The United Nations Convention against Cybercrime aims to strengthen such cooperation by establishing shared procedural frameworks for digital investigations and evidence gathering.<\/p>\n<p>In the context of children\u2019s protection, the Convention may contribute to improving law enforcement responses to offences involving the exploitation or abuse of children in digital environments. 
By facilitating access to digital evidence and enabling cross-border investigations, the treaty may enhance the ability of national authorities to identify perpetrators and protect victims.<\/p>\n<p>At the same time, the Convention highlights the need for procedural safeguards ensuring that investigative powers exercised in the digital sphere do not undermine human rights. Investigations involving digital communications, surveillance technologies, or data interception must therefore comply with international human rights standards governing privacy, due process, and freedom of expression.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.4.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 The Lebanese Legal Framework<\/h3>\n<p>At the national level, Lebanon has adopted several legislative measures relevant to the regulation of the digital environment. The most significant of these is Law No. 81 of 10 October 2018 on Electronic Transactions and Personal Data, which provides the principal legal framework governing electronic communications, electronic signatures, and aspects of personal data protection.<\/p>\n<p>Law No. 81\/2018 establishes rules governing electronic transactions, digital authentication mechanisms, and certain aspects of cybersecurity. It also introduces provisions addressing the protection of personal data processed through electronic systems. These provisions are intended to regulate the collection, storage, and processing of personal data by public and private entities operating in Lebanon.<\/p>\n<p>While Law No. 81\/2018 represents an important step toward regulating digital activities within the country, it does not fully address the broader challenges associated with cybercrime and digital investigations. 
In particular, the law does not establish a comprehensive framework for international cooperation in cybercrime investigations or the exchange of electronic evidence with foreign jurisdictions.<\/p>\n<p>Moreover, the law was adopted before the recent acceleration of global policy debates concerning digital platform regulation, algorithmic accountability, and children\u2019s rights in digital environments. As a result, additional legislative reforms may be necessary to ensure that Lebanon\u2019s legal framework remains aligned with evolving international standards.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Lebanon and the United Nations Convention against Cybercrime<\/h3>\n<p>Lebanon has not yet acceded to the United Nations Convention against Cybercrime. Accession to the treaty could contribute significantly to strengthening Lebanon\u2019s capacity to investigate and prosecute cybercrime while facilitating international cooperation in obtaining electronic evidence.<\/p>\n<p>Participation in the Convention would allow Lebanese authorities to benefit from the treaty\u2019s mechanisms for mutual legal assistance and cross-border cooperation. This could be particularly important in cases involving digital offences that affect children and involve platforms or perpetrators located outside Lebanese territory.<\/p>\n<p>Accession could also contribute to aligning Lebanon\u2019s legal framework with evolving international standards concerning digital justice, data protection, and human rights in the digital environment. 
The Convention emphasizes the need to ensure that cybercrime investigations respect fundamental rights and procedural safeguards, including protections for privacy and freedom of expression.<\/p>\n<p>For Lebanon, adopting such standards could strengthen both national cybercrime responses and broader efforts to regulate digital platforms in ways that protect users, including children, from harm.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Integrating Cybercrime Regulation into Child Protection Policies<\/h3>\n<p>Addressing online risks affecting children requires an integrated approach combining criminal law enforcement, platform regulation, data protection safeguards, and educational initiatives promoting digital literacy. Cybercrime legislation alone cannot fully address the complex challenges posed by children\u2019s engagement with digital technologies. However, effective legal frameworks for investigating digital offences remain an essential component of broader strategies aimed at protecting children online.<\/p>\n<p>Strengthening Lebanon\u2019s legal capacity to address cybercrime may therefore complement other policy initiatives, including the development of national strategies regulating children\u2019s use of digital technologies and the promotion of safer online environments.<\/p>\n<p>By aligning domestic legislation with international legal instruments such as the United Nations Convention against Cybercrime, Lebanon may enhance its ability to combat digital offences while ensuring that regulatory responses remain consistent with international human rights standards.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 9.7.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 The Protection of Women from Digital and Online Violence<\/h3>\n<p>A Draft Law on the Protection of Women from Digital 
Violence was submitted to the Lebanese Parliament on 25 February 2026. The proposal was prepared with the support of the civil society organization \u201cFEMALE\u201d and introduced in Parliament by several Members of Parliament, including Bilal Abdallah, Paula Yacoubian, Jamil El-Sayyed, Tony Frangieh, Nada Boustani, Halima Kaakour, Saji Attieh, and Michel Douaihy. The draft law aims to establish a comprehensive legal framework to prevent, criminalize, and respond to different forms of technology-facilitated violence targeting women. It provides a broad definition of digital violence against women, encompassing acts committed through information and communication technologies that undermine women\u2019s dignity, privacy, psychological well-being, or security. The proposal criminalizes a range of offences including cyberstalking, online harassment, identity theft, electronic extortion, the non-consensual dissemination of intimate images, and the misuse of digital platforms to threaten, intimidate, or exploit women.<\/p>\n<p>Beyond criminal sanctions, the proposal introduces protection mechanisms for victims, including judicial protection orders, accessible reporting mechanisms, and the provision of legal, social, and psychological support services. It also establishes preventive and institutional measures aimed at strengthening coordination between law enforcement authorities, judicial institutions, and civil society organizations, while promoting awareness and prevention strategies to address gender-based digital violence. 
The draft law therefore represents a significant legislative effort to address emerging forms of online gender-based violence and to enhance the protection of women\u2019s rights in Lebanon in line with international human rights standards.<\/p>\n<p>Although the Draft Law on the Protection of Women from Digital Violence (2026) primarily aims to address technology-facilitated violence against women, several of its provisions indirectly contribute to strengthening protections for children and adolescents in the digital environment. The law establishes a broad legal framework addressing forms of online abuse, harassment, and exploitation that frequently affect minors as well as adult women.<\/p>\n<p>First, the draft law criminalizes a range of online behaviors such as cyberstalking, online harassment, identity theft, electronic extortion, and the non-consensual dissemination of images or personal data. These acts are among the most common forms of digital harm experienced by children and adolescents, particularly girls. By criminalizing such conduct and imposing penalties on perpetrators, the proposed legislation contributes to deterring harmful behavior in online spaces and enhancing accountability for technology-facilitated abuse.<\/p>\n<p>Second, the law introduces protective judicial measures, including the possibility for victims to obtain protection orders and access support services. These mechanisms are particularly relevant for children who may be victims of online bullying, grooming, or digital exploitation. The availability of legal remedies and reporting procedures can help ensure that minors affected by digital violence receive timely protection and assistance.<\/p>\n<p>Third, the proposal promotes institutional coordination and awareness-raising efforts among public authorities, law enforcement agencies, and civil society organizations to prevent digital violence. 
Such preventive measures are essential for protecting children in the digital environment, as they encourage safer online practices, improve reporting mechanisms, and strengthen institutional responses to online harm.<\/p>\n<p>Finally, the law recognizes the broader societal risks associated with the misuse of digital technologies and emphasizes the need for policy responses that safeguard dignity, privacy, and personal security in online spaces. These principles align with international human rights standards, including the Convention on the Rights of the Child, which obliges states to protect children from all forms of violence, exploitation, and abuse, including those occurring through digital technologies.<\/p>\n<p>Taken together, while the draft law is primarily framed around the protection of women, its provisions contribute to the broader objective of protecting vulnerable groups, including children, from technology-facilitated violence, thereby supporting the development of a safer and more accountable digital environment in Lebanon.<\/p>\n<h2>10.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Strengthening the Legal Framework for Artificial Intelligence Regulation in Lebanon<\/h2>\n<p>Recent legislative and policy developments in Lebanon indicate growing recognition among policymakers of the need to regulate emerging digital technologies, particularly artificial intelligence, and to address their implications for human rights, privacy, digital governance, and the protection of children in online environments. 
As digital platforms, algorithmic systems, and generative artificial intelligence tools increasingly shape communication, economic activity, and public administration, Lebanese authorities have begun exploring institutional and legislative mechanisms capable of governing these technologies.<\/p>\n<p>Taken together, these initiatives illustrate the gradual emergence of a national regulatory framework addressing digital harms, artificial intelligence governance, and the broader transformation of Lebanon\u2019s digital ecosystem. However, the current landscape remains fragmented, and the effectiveness of these initiatives will depend on the development of coherent legal safeguards aligned with international human rights standards.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Emerging Policy Frameworks for Digital Governance<\/h3>\n<p>In February 2026, the Lebanese Council of Ministers adopted Decision No. 13 (Minutes No. 52) establishing an inter-ministerial committee tasked with developing a comprehensive national strategy to guide and regulate children\u2019s use of the internet and digital applications. The initiative, proposed by the Ministry of Information, reflects growing concern regarding the impact of digital technologies on children and the risks associated with online environments.<\/p>\n<p>The Cabinet decision acknowledges that although several Lebanese laws provide partial protections relevant to digital activities, Lebanon currently lacks a comprehensive regulatory framework governing children\u2019s use of digital platforms and emerging technologies.<\/p>\n<p>Existing legal instruments referenced in the decision include:<\/p>\n<ul>\n<li>The Lebanese Penal Code (Legislative Decree No. 340 of 1943)<\/li>\n<li>The Law on the Protection of Juveniles in Conflict with the Law or at Risk (Law No. 422 of 2002)<\/li>\n<li>The Law on the Protection of Women and Other Family Members from Domestic Violence (Law No. 293 of 2014)<\/li>\n<li>The Law on Electronic Transactions and Personal Data (Law No. 81 of 2018).<\/li>\n<\/ul>\n<p>While these instruments provide certain safeguards, they were adopted prior to the rapid expansion of digital platforms and artificial intelligence technologies and therefore do not comprehensively address contemporary digital governance challenges.<\/p>\n<p>The Cabinet decision also emphasizes Lebanon\u2019s obligations under the Convention on the Rights of the Child, which requires states to ensure the protection and best interests of children in all environments, including digital spaces.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Legislative Initiatives Addressing Social Media Risks<\/h3>\n<p>Parallel to these policy initiatives, Lebanese lawmakers have introduced legislative proposals aimed at addressing specific digital risks affecting children.<\/p>\n<p>One such proposal submitted on 25 February 2026 by Member of Parliament Tony Frangieh seeks to prohibit the use of social media platforms by minors under the age of fourteen. 
The proposal requires digital platforms operating in Lebanon to implement mechanisms verifying users\u2019 age and preventing minors from creating accounts.<\/p>\n<p>The draft law also introduces provisions aimed at strengthening the protection of minors\u2019 personal data by prohibiting the commercial exploitation or unauthorized collection of children\u2019s personal information.<\/p>\n<p>The proposal reflects growing concerns regarding the potential harms associated with early exposure to social media platforms, including cyberbullying, exposure to harmful or inappropriate content, and online exploitation.<\/p>\n<p>However, as noted earlier in this report, such restrictions also raise complex policy questions regarding proportionality, effectiveness, and compatibility with international human rights standards relating to freedom of expression and access to information.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Addressing Artificial Intelligence-Generated Harms<\/h3>\n<p>Beyond social media regulation, Lebanese lawmakers have also begun addressing emerging harms associated with artificial intelligence technologies.<\/p>\n<p>A draft law introduced in 2026 titled \u201cDraft Law Criminalizing the Creation, Modification, or Use of Intimate or Indecent Images and Videos Generated or Altered by Artificial Intelligence without the Consent of the Person Concerned\u201d seeks to address the growing problem of synthetic audiovisual manipulation commonly known as deepfakes.<\/p>\n<p>The proposal, introduced by Member of Parliament Anan Abdallah and other members of Parliament, criminalizes the creation, modification, or dissemination of artificial intelligence-generated content depicting individuals in intimate or degrading situations without their explicit consent.<\/p>\n<p>The proposed legislation establishes liability for individuals who produce, distribute, or facilitate the 
circulation of such manipulated content through digital platforms. Importantly, it also extends liability to developers or providers of artificial intelligence tools that are intentionally designed for abusive purposes or knowingly used to produce harmful content.<\/p>\n<p>The draft law introduces penalties including imprisonment, financial fines, and the confiscation of devices or software used to commit the offence. It also establishes aggravated sanctions in cases where the victim is a minor or where manipulated content is widely disseminated through digital platforms, recognizing the heightened harm that such acts may cause to vulnerable individuals.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.4.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Institutional Development of Artificial Intelligence Governance<\/h3>\n<p>Alongside legislative efforts addressing specific digital harms, Lebanon has also begun considering broader institutional mechanisms for governing artificial intelligence technologies.<\/p>\n<p>In September 2025, the Lebanese Council of Ministers approved a draft law establishing the Ministry of Information Technology and Artificial Intelligence (MITAI). The initiative seeks to transform the existing State Ministry for Technology and Artificial Intelligence into a fully-fledged ministry responsible for coordinating Lebanon\u2019s national digital transformation strategy.<\/p>\n<p>The initiative, led by Minister Kamal Shehadeh, aims to strengthen Lebanon\u2019s digital infrastructure, develop regulatory frameworks governing artificial intelligence technologies, and promote technological innovation across both public administration and the private sector.<\/p>\n<p>According to the draft law transmitted to Parliament pursuant to Decree No. 
12867 of 19 September 2025, the proposed ministry would be responsible for:<\/p>\n<ul>\n<li>developing national strategies for digital technologies and artificial intelligence<\/li>\n<li>supervising the national digital ecosystem<\/li>\n<li>strengthening cybersecurity policies<\/li>\n<li>protecting personal data<\/li>\n<li>supporting digital innovation and entrepreneurship.<\/li>\n<\/ul>\n<p>The draft legislation also envisages the creation of specialized directorates responsible for implementing national digital governance policies, including:<\/p>\n<ul>\n<li>a Directorate for Technology and Artificial Intelligence<\/li>\n<li>a Directorate for Cybersecurity and Data Protection<\/li>\n<li>a Directorate for the Digital Economy and Entrepreneurship.<\/li>\n<\/ul>\n<p>If adopted, the creation of MITAI would represent a major institutional step toward consolidating digital governance responsibilities within a dedicated governmental authority.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Proposal for a National Artificial Intelligence Authority<\/h3>\n<p>In parallel with these institutional initiatives, Lebanese lawmakers have proposed the establishment of an independent regulatory authority dedicated to artificial intelligence governance.<\/p>\n<p>On 4 June 2025, Members of Parliament Edgar Traboulsi, Gebran Bassil, Georges Atallah, Cesar Abi Khalil, Nicolas Sehnaoui, and Jimmy Jabbour submitted a Draft Law on the Establishment of the National Artificial Intelligence Authority.<\/p>\n<p>The proposed authority would function as an independent national body responsible for developing and overseeing Lebanon\u2019s national strategy for artificial intelligence.<\/p>\n<p>According to the draft law, the authority would be tasked with:<\/p>\n<ul>\n<li>preparing the national strategy for the artificial intelligence sector<\/li>\n<li>proposing regulatory frameworks governing AI technologies<\/li>\n<li>monitoring the implementation of AI policies<\/li>\n<li>supervising the ethical and responsible use of artificial intelligence<\/li>\n<li>proposing legislative reforms where necessary.<\/li>\n<\/ul>\n<p>The authority would also report periodically to the Council of Ministers through the Minister of Telecommunications.<\/p>\n<p>The proposed institutional structure includes representatives from government institutions, the information technology and artificial intelligence sector, and civil society organizations specializing in digital technologies.<\/p>\n<p>The draft law emphasizes the importance of aligning artificial intelligence development with Lebanon\u2019s broader legislative and public policy objectives, including transparency, access to information, technological innovation, and responsible governance.<\/p>\n<p>It also highlights the need to strengthen national capacities in education, research, and technological development to ensure that Lebanon can benefit from the opportunities presented by artificial intelligence while mitigating potential risks to society and fundamental rights.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Civil Society Concerns and Digital Sovereignty<\/h3>\n<p>Alongside governmental initiatives, Lebanese civil society organizations have expressed concerns regarding the governance implications of ongoing digital transformation efforts.<\/p>\n<p>The digital rights organization SMEX (Social Media Exchange) has closely monitored developments relating to artificial intelligence governance and digital infrastructure in Lebanon. 
Civil society organizations have raised concerns regarding transparency, data protection, and the risks associated with reliance on foreign technology providers.<\/p>\n<p>In particular, SMEX has warned that reliance on external technology providers for national digital infrastructure may pose risks to Lebanon\u2019s digital sovereignty and the protection of citizens\u2019 personal data.<\/p>\n<p>These concerns were amplified following the announcement in December 2025 of an agreement with the technology company Oracle to provide artificial intelligence training to approximately 50,000 participants in Lebanon. While the government emphasized that the agreement does not grant foreign companies access to public sector data, civil society organizations expressed concern that insufficient regulatory safeguards could expose sensitive information to external actors.<\/p>\n<p>Observers have also highlighted shortcomings in Lebanon\u2019s existing data protection framework. Although Law No. 81 of 10 October 2018 on Electronic Transactions and Personal Data provides a legal foundation for regulating personal data processing, it has not yet been fully implemented and lacks strong independent oversight mechanisms.<\/p>\n<p>Civil society organizations have therefore called for stronger institutional safeguards to ensure transparency in digital governance, accountability in public-private technology partnerships, and effective protection of personal data.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 10.7.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Toward a Rights-Based Artificial Intelligence Governance Framework<\/h3>\n<p>Taken together, these initiatives demonstrate that Lebanon has begun to develop an institutional and legislative framework addressing the challenges posed by artificial intelligence and digital technologies.<\/p>\n<p>However, the current regulatory landscape remains fragmented and requires further consolidation in 
order to ensure coherent governance.<\/p>\n<p>Developing an effective regulatory framework for artificial intelligence will require addressing several key issues, including:<\/p>\n<ul>\n<li>protection of privacy and personal data<\/li>\n<li>accountability for algorithmic decision-making systems<\/li>\n<li>safeguards against digital harassment and AI-generated manipulation<\/li>\n<li>protection of children in digital environments<\/li>\n<li>transparency and oversight in the use of AI technologies by public authorities.<\/li>\n<\/ul>\n<p>Incorporating these safeguards into digital governance frameworks will be critical to ensuring that technological innovation in Lebanon develops in a manner consistent with human rights, democratic governance, and the protection of vulnerable individuals in the digital age.<\/p>\n<p>&nbsp;<\/p>\n<h2>11.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations and Outcomes<\/h2>\n<p>&nbsp;<\/p>\n<p>In light of the findings of this report, the Lebanese National Human Rights Commission, including the Committee for the Prevention of Torture, should call for a coordinated, rights-based national approach to digital governance that protects children, safeguards privacy, strengthens accountability for digital platforms and artificial intelligence systems, and aligns Lebanon\u2019s laws and institutions with international human rights standards. The recommendations below are directed to the Lebanese Government, the Lebanese Parliament, relevant ministries and public authorities, civil society organizations, and United Nations agencies and treaty bodies.<\/p>\n<p>The NHRC-CPT should position itself as a central independent actor in Lebanon\u2019s emerging digital rights framework. It should issue formal opinions on draft laws, monitor their human rights impact, engage with ministries and Parliament, and advocate for child-sensitive, privacy-respecting, and rights-based digital regulation. 
It should also explore mechanisms for receiving and documenting complaints related to digital harms affecting children and other vulnerable groups.<\/p>\n<p>Through this role, the NHRC-CPT can help ensure that Lebanon\u2019s response to social media risks, cybercrime, and artificial intelligence is not driven solely by security or moral panic, but by principled adherence to human dignity, the rule of law, and international human rights obligations.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations to the Lebanese Government<\/h3>\n<p>The Government of Lebanon should adopt a whole-of-government national strategy on children\u2019s rights in the digital environment, ensuring that all regulatory initiatives affecting children\u2019s access to digital technologies are guided by the best interests of the child, the principles of legality, necessity, and proportionality, and the obligation to protect children while preserving their rights to expression, information, education, participation, and privacy.<\/p>\n<p>The Council of Ministers should ensure that the inter-ministerial committee established by Decision No. 13 of 26 February 2026 operates transparently, includes meaningful consultation with children, parents, teachers, child protection specialists, digital rights experts, and civil society organizations, and produces a public national strategy with clear objectives, timelines, and institutional responsibilities. The Government should also ensure that the National Human Rights Commission plays a substantive and independent oversight role in this process.<\/p>\n<p>The Government should refrain from adopting blanket digital restrictions affecting children unless it can demonstrate that such measures are strictly necessary, proportionate, evidence-based, and accompanied by robust safeguards for children\u2019s rights. 
Instead, priority should be given to regulatory measures targeting the structural drivers of online harm, including unsafe platform design, opaque algorithmic systems, exploitative data practices, and weak complaint and remedy mechanisms.<\/p>\n<p>The Government should initiate the accession process to the United Nations Convention against Cybercrime, while ensuring that any implementing measures fully comply with international human rights law, particularly protections for privacy, freedom of expression, due process, and judicial oversight. It should also update domestic legislation to regulate cybercrime investigations, cross-border electronic evidence, and digital procedural safeguards in a manner consistent with human rights standards.<\/p>\n<p>The Government should accelerate the implementation and reform of Law No. 81\/2018 on Electronic Transactions and Personal Data by establishing effective enforcement mechanisms and independent oversight for personal data protection, particularly in relation to children\u2019s data, biometric data, and age verification systems.<\/p>\n<p>In the area of artificial intelligence, the Government should ensure that any future ministry, authority, or regulatory mechanism tasked with AI governance is established on the basis of independence, transparency, public accountability, and human rights compliance. 
AI governance should include mandatory safeguards against discrimination, unlawful surveillance, privacy violations, and AI-generated harms such as non-consensual deepfakes and manipulative synthetic media.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.2.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Expected outcomes from government action<\/h3>\n<ul>\n<li>Adopting a comprehensive national strategy on children\u2019s rights in the digital environment based on international human rights standards, ensuring the protection of children while safeguarding their fundamental rights to expression, participation, access to information, and privacy.<\/li>\n<li>Developing a coherent national framework for digital governance that coordinates policies related to online child protection, regulation of digital platforms, cybercrime prevention, personal data protection, and the governance of artificial intelligence.<\/li>\n<li>Strengthening Lebanon\u2019s capacity to investigate and prosecute digital crimes, particularly those targeting children, by aligning national legislation with international standards and enhancing international cooperation mechanisms related to digital evidence.<\/li>\n<li>Establishing an effective personal data protection system that includes clear enforcement mechanisms and independent oversight, with particular attention to the protection of children\u2019s data, biometric data, and age-verification systems.<\/li>\n<li>Adopting a human rights\u2013based regulatory framework for artificial intelligence that ensures transparency and accountability and introduces safeguards to prevent discrimination, unlawful surveillance, and manipulation of digital content.<\/li>\n<li>Enhancing accountability and transparency in the operations of digital platforms and technology companies, including the adoption of rules related to safer platform design, limitations on targeted advertising to children, and strengthened 
complaint and remedy mechanisms.<\/li>\n<li>Supporting initiatives on digital literacy and empowerment for children, parents, and teachers to raise awareness of digital risks and promote the safe and responsible use of technology.<\/li>\n<li>Strengthening institutional coordination and cooperation among government entities, civil society, the private sector, and international organizations to develop balanced digital policies that protect rights while supporting technological innovation in Lebanon.<\/li>\n<\/ul>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.3.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations to the Lebanese Parliament<\/h3>\n<p>The Lebanese Parliament should review all draft laws relating to social media, cybercrime, artificial intelligence, and digital governance through a human rights lens. In particular, Parliament should subject the draft law prohibiting social media use by minors under fourteen to careful scrutiny in light of the Convention on the Rights of the Child, General Comment No. 25, and Article 19 of the International Covenant on Civil and Political Rights.<\/p>\n<p>Parliament should amend any proposal relying on intrusive age verification or broad platform sanctions unless such measures are narrowly tailored, privacy-preserving, and subject to independent oversight. 
Instead of relying primarily on prohibition, Parliament should legislate for age-appropriate design, stronger data protection for children, restrictions on targeted advertising to minors, and clear obligations on platforms to assess and mitigate risks to children\u2019s rights.<\/p>\n<p>Parliament should adopt a modern and comprehensive legal framework for artificial intelligence that clearly regulates public and private uses of AI, provides remedies for victims of AI-generated harms, requires transparency and human rights due diligence, and establishes accountability for developers, deployers, and intermediaries.<\/p>\n<p>Parliament should also consider the creation of an independent digital rights or data protection authority, or ensure that any proposed National Artificial Intelligence Authority has sufficient independence, expertise, and oversight powers, including the authority to receive complaints and investigate violations.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.4.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Expected outcomes from parliamentary action<\/h3>\n<ul>\n<li>Ensuring that all legislation related to the digital environment, including the regulation of social media, cybercrime, and artificial intelligence, complies with international human rights standards, particularly the Convention on the Rights of the Child and the International Covenant on Civil and Political Rights.<\/li>\n<li>Developing a balanced legislative framework that protects children from digital risks without imposing disproportionate restrictions on their rights to freedom of expression, access to information, and participation in digital life.<\/li>\n<li>Strengthening the protection of children\u2019s personal data by establishing clear rules regarding data collection, processing, and use, and limiting targeted advertising to minors and exploitative digital practices.<\/li>\n<li>Adopting modern legislation regulating 
artificial intelligence that ensures transparency and accountability and provides remedies for individuals harmed by algorithmic systems or AI-generated content.<\/li>\n<li>Strengthening parliamentary oversight of digital policies by monitoring the implementation of national strategies related to digital governance and ensuring that government initiatives are subject to democratic accountability.<\/li>\n<li>Supporting the establishment of independent bodies for data protection or digital rights with sufficient expertise and authority to oversee compliance with digital laws and investigate violations.<\/li>\n<li>Strengthening public trust in digital policies and technology-related legislation by ensuring transparency, accountability, and respect for fundamental rights in the legislative decision-making process.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.5.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations to Ministries and Public Authorities<\/h3>\n<p>The Ministries of Information, Telecommunications, Justice, Social Affairs, Education and Higher Education, Interior and Municipalities, Technology and Artificial Intelligence, and other relevant authorities should coordinate closely to ensure that digital regulation is not approached solely as a technical or security issue, but also as a child protection, privacy, education, and human rights issue.<\/p>\n<p>The Ministry of Education should develop digital literacy and online safety curricula tailored to different age groups, including modules on privacy, cyberbullying, misinformation, consent, online exploitation, and responsible use of artificial intelligence tools. The Ministry of Social Affairs should strengthen psychosocial support and reporting pathways for children affected by online harms. 
The Ministry of Justice should review procedural laws and criminal legislation to ensure effective remedies and fair processes in digital cases. The Ministry of Telecommunications should ensure that regulatory measures imposed on platforms are lawful, transparent, and rights-compliant.<\/p>\n<p>Public authorities should also publish technology agreements, digital transformation plans, and AI-related partnerships affecting public services, subject to narrow exceptions justified by law, in order to guarantee transparency and public accountability.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.6.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Expected outcomes from ministerial action<\/h3>\n<ul>\n<li>Strengthening institutional coordination among ministries and public authorities in regulating the digital space, ensuring a comprehensive approach that considers child protection, privacy, education, and human rights alongside technical and security considerations.<\/li>\n<li>Integrating digital literacy and online safety into national educational curricula, enabling children and young people to acquire the skills necessary for the safe and responsible use of digital technologies and artificial intelligence.<\/li>\n<li>Developing effective reporting and support mechanisms for children affected by digital harm, including psychological and social support services and clear pathways for reporting cyberbullying or online exploitation.<\/li>\n<li>Improving the procedural and legal framework for addressing digital crimes, ensuring the availability of effective remedies for victims and fair and efficient judicial procedures in cases related to the digital environment.<\/li>\n<li>Enhancing transparency and accountability in digital transformation policies and technology partnerships between the public and private sectors, including the publication of agreements and initiatives related to technology and artificial 
intelligence.<\/li>\n<li>Ensuring that regulatory measures imposed on digital platforms are lawful, clear, and consistent with human rights standards, while strengthening protections for users, particularly children.<\/li>\n<li>Building institutional capacities within ministries and public authorities to address challenges related to digital technologies and artificial intelligence, thereby strengthening the state\u2019s ability to develop balanced and sustainable digital policies.<\/li>\n<li>Strengthening public trust in governmental digital policies through transparency, accountability, and the protection of fundamental rights in the management of Lebanon\u2019s digital transformation.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.7.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations to Civil Society Organizations<\/h3>\n<p>Civil society organizations should continue monitoring legislative and policy developments affecting children\u2019s rights, digital governance, cybercrime regulation, and artificial intelligence in Lebanon. They should engage in evidence-based advocacy, contribute to public consultation processes, and provide independent expertise on privacy, digital rights, child protection, gender-based online violence, and digital sovereignty.<\/p>\n<p>Organizations working with children and families should expand awareness campaigns on children\u2019s rights in the digital environment and create accessible reporting and support mechanisms for those exposed to online harm. 
Digital rights organizations should continue scrutinizing technology agreements, regulatory proposals, and institutional reforms, including the governance implications of foreign technology partnerships and AI training initiatives.<\/p>\n<p>Civil society should also build coalitions across sectors, including child rights, media freedom, women\u2019s rights, disability rights, education, and technology policy, to ensure that digital governance debates are inclusive and rights-based.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.8.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Expected outcomes from civil society action<\/h3>\n<ul>\n<li>Strengthening independent oversight of digital legislation and policies to ensure their alignment with international human rights standards and the protection of children in the digital environment.<\/li>\n<li>Raising public awareness of children\u2019s rights online and the risks associated with cyberbullying, digital exploitation, privacy violations, and misinformation.<\/li>\n<li>Developing effective reporting and support mechanisms for victims, particularly children and adolescents exposed to online harm or violence.<\/li>\n<li>Enhancing accountability and transparency in digital policies and technology partnerships between the public and private sectors, including agreements related to digital infrastructure or artificial intelligence programs.<\/li>\n<li>Providing independent expertise to decision-makers through research, studies, and public consultations related to digital governance, data protection, and artificial intelligence.<\/li>\n<li>Building broad coalitions among civil society organizations working in the fields of children\u2019s rights, media freedom, women\u2019s rights, education, and technology policy, thereby strengthening a comprehensive approach to digital governance.<\/li>\n<li>Strengthening civil society participation in the development of digital policies to 
ensure that these policies are inclusive and responsive to human rights considerations and the needs of the most vulnerable groups.<\/li>\n<li>Supporting the development of a safer, more equitable, and more accountable digital environment in Lebanon, enabling society to benefit from technological innovation while reducing its risks to individuals and communities.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.9.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Recommendations to UN Agencies and Treaty Bodies<\/h3>\n<p>UN agencies, including UNICEF, OHCHR, UNESCO, UNDP, and ITU, should provide coordinated technical assistance to Lebanon in the development of a national strategy on children\u2019s rights in the digital environment, digital literacy policies, privacy safeguards, and AI governance frameworks grounded in human rights.<\/p>\n<p>UNICEF should support child-centered policy development and meaningful child participation in digital governance reform. OHCHR should provide guidance on the compatibility of proposed laws and policies with international human rights standards, including on privacy, expression, and the rights of the child. UNESCO should support ethical AI policy development and digital education. 
UNDP and other partners should assist with institutional capacity-building and regulatory design.<\/p>\n<p>Treaty bodies, especially the Committee on the Rights of the Child and the Human Rights Committee, should continue to address Lebanon\u2019s digital governance obligations in their dialogues and concluding observations, including the regulation of children\u2019s online safety, data protection, cybercrime enforcement, and AI-related harms.<\/p>\n<h3>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 11.10.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Expected outcomes from UN engagement<\/h3>\n<ul>\n<li>Providing technical and methodological support to Lebanon in developing a comprehensive national strategy on children\u2019s rights in the digital environment and advanced policies for digital governance.<\/li>\n<li>Strengthening the alignment of Lebanese legislation and policies with international human rights standards, particularly the Convention on the Rights of the Child and the International Covenant on Civil and Political Rights.<\/li>\n<li>Supporting the development of educational policies on digital literacy and promoting children\u2019s and young people\u2019s skills for the safe and responsible use of technology and artificial intelligence.<\/li>\n<li>Strengthening the institutional capacities of the Lebanese government and national bodies in the areas of data protection, artificial intelligence regulation, and cybercrime prevention.<\/li>\n<li>Promoting the participation of children and young people in the development of digital policies to ensure that these policies reflect their needs and experiences in the digital environment.<\/li>\n<li>Developing ethical and regulatory frameworks for artificial intelligence based on the principles of transparency, accountability, and respect for human rights.<\/li>\n<li>Encouraging international cooperation and the exchange of expertise in the field of digital governance and the protection of 
children online.<\/li>\n<li>Supporting continuous international monitoring of digital developments in Lebanon through international treaty mechanisms, thereby strengthening compliance with Lebanon\u2019s human rights obligations.<\/li>\n<\/ul>\n<h1>Footnotes<\/h1>\n<p><a href=\"#_ftnref1\" name=\"_ftn1\"><sup>[1]<\/sup><\/a> Australian Government, Social Media Age Restrictions Legislation, December 2025.<\/p>\n<p><a href=\"#_ftnref2\" name=\"_ftn2\"><sup>[2]<\/sup><\/a> \u200b\u200bPureprofile Research Survey on Public Attitudes Toward Social Media Restrictions, 2025.<\/p>\n<p><a href=\"#_ftnref3\" name=\"_ftn3\"><sup>[3]<\/sup><\/a> Amnesty International Malaysia, \u201cMalaysia: Effectively regulate social media to protect children and young people instead of imposing a blanket ban,\u201d 3 December 2025.<\/p>\n<p><a href=\"#_ftnref4\" name=\"_ftn4\"><sup>[4]<\/sup><\/a> European Union, Digital Services Act (Regulation EU 2022\/2065).<\/p>\n<p><a href=\"#_ftnref5\" name=\"_ftn5\"><sup>[5]<\/sup><\/a> Convention on the Rights of the Child, Articles 13, 16, and 17.<\/p>\n<p><a href=\"#_ftnref6\" name=\"_ftn6\"><sup>[6]<\/sup><\/a> International Covenant on Civil and Political Rights, Article 19.<\/p>\n<p><a href=\"#_ftnref7\" name=\"_ftn7\"><sup>[7]<\/sup><\/a> UN Committee on the Rights of the Child, General Comment No. 25 (2021) on children\u2019s rights in relation to the digital environment.<\/p>\n<p><a href=\"#_ftnref8\" name=\"_ftn8\"><sup>[8]<\/sup><\/a> Amnesty International, \u201cSurveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights,\u201d 2019.<\/p>\n<p><a href=\"#_ftnref9\" name=\"_ftn9\"><sup>[9]<\/sup><\/a> UN Committee on the Rights of the Child, General comment No. 25 (2021) on children\u2019s rights in relation to the digital environment, UN Doc. CRC\/C\/GC\/25, 2 March 2021. 
<a href=\"https:\/\/www.ohchr.org\/en\/documents\/general-comments-and-recommendations\/general-comment-no-25-2021-childrens-rights-relation\">https:\/\/www.ohchr.org\/en\/documents\/general-comments-and-recommendations\/general-comment-no-25-2021-childrens-rights-relation<\/a><\/p>\n<hr \/>\n<p><strong>Title:<\/strong> Protecting Children in the Digital Environment \u2502 Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon<\/p>\n<p><strong>Publisher:<\/strong> The Lebanese Republic | The National Human Rights Commission, which includes the Committee for the Prevention of Torture (NHRC-CPT)<\/p>\n<p><strong>Author:<\/strong> Bassam Alkantar \u2502 Commissioner for International Relations and Information at NHRC-CPT.<\/p>\n<p><strong>First Edition: <\/strong>2026<\/p>\n<p><strong>\ud83d\udccd\u00a0 Address: <\/strong>Serhal Building, First Floor, Sami El Solh Boulevard, Beirut, Lebanon.<\/p>\n<p><strong>\ud83d\udce7 Email: <\/strong>info@nhrclb.org<\/p>\n<p><strong>\ud83c\udf10 Website:<\/strong> <a href=\"https:\/\/nhrclb.org\">https:\/\/nhrclb.org<\/a><\/p>\n<p><strong>\u260e\ufe0f Hotline:<\/strong> +961 3 923 456<\/p>\n<p><strong>Facebook:<\/strong> <a href=\"http:\/\/fb.nhrclb.org\">http:\/\/fb.nhrclb.org<\/a><\/p>\n<p><strong>X:<\/strong> <a href=\"http:\/\/twitter.nhrclb.org\">http:\/\/twitter.nhrclb.org<\/a><\/p>\n<p><strong>Instagram:<\/strong> <a href=\"http:\/\/insta.nhrclb.org\">http:\/\/insta.nhrclb.org<\/a><\/p>\n<p><strong>YouTube:<\/strong> <a href=\"http:\/\/yt.nhrclb.org\">http:\/\/yt.nhrclb.org<\/a><\/p>\n<p><strong>Flickr:<\/strong> <a href=\"https:\/\/www.flickr.com\/photos\/145354751@N08\/\">https:\/\/www.flickr.com\/photos\/145354751@N08\/<\/a><\/p>\n<p><strong>Bluesky: <\/strong><a href=\"https:\/\/bsky.app\/profile\/nhrclb.bsky.social\">https:\/\/bsky.app\/profile\/nhrclb.bsky.social<\/a><\/p>\n<p><strong>Tumblr:<\/strong> <a 
href=\"https:\/\/www.tumblr.com\/nhrclb\">https:\/\/www.tumblr.com\/nhrclb<\/a><\/p>\n<p><strong>Mastodon:<\/strong> <a href=\"https:\/\/mastodon.social\/@nhrclb\">https:\/\/mastodon.social\/@nhrclb<\/a><\/p>\n<p><strong>LinkedIn:<\/strong> <a href=\"https:\/\/www.linkedin.com\/company\/nhrclb\/\">https:\/\/www.linkedin.com\/company\/nhrclb\/<\/a><\/p>\n<p><strong>Threads:<\/strong> <a href=\"https:\/\/www.threads.com\/@nhrc_lb\">https:\/\/www.threads.com\/@nhrc_lb<\/a><\/p>\n<hr \/>\n<p>Some Rights Reserved (CC), National Human Rights Commission, including the Committee for the Prevention of Torture \u2013 Lebanon, 2026.<\/p>\n<p>The views expressed in this report are those of the National Human Rights Commission, including the Committee for the Prevention of Torture, and do not necessarily reflect the views of any parties mentioned in the report or of any past or current partners.<\/p>\n<p>This document is available under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0).<\/p>\n<p>Reproduction, storage in a retrieval system, or transmission of this book in any form or by any means\u2014electronic, mechanical, photocopying, recording, or otherwise\u2014for commercial purposes is strictly prohibited without prior written permission from the publisher.<\/p>\n<p>For more information, please visit the copyright page on the Commission\u2019s website:<\/p>\n<p><a href=\"https:\/\/nhrclb.org\/copyright\">https:\/\/nhrclb.org\/copyright<\/a><\/p>\n<p>Permissions: Requests for commercial use, additional rights, or licensing should be directed to: <a href=\"mailto:info@nhrclb.org\">info@nhrclb.org<\/a><\/p>\n<p>The National Human Rights Commission, which includes the Committee for the Prevention of Torture, works to protect and promote human rights in Lebanon in accordance with the standards set out in the Constitution, the Universal Declaration of Human Rights, relevant international treaties and conventions, and 
domestic laws aligned with these standards. It is an independent national institution established under Law No. 62\/2016, pursuant to the United Nations General Assembly resolution (the Paris Principles), which governs the creation and functioning of national human rights institutions. The Commission also serves as the National Preventive Mechanism against torture, in line with the provisions of the Optional Protocol to the Convention against Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment, which Lebanon acceded to under Law No. 12\/2008.<\/p>\n<hr \/>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Author: Bassam Alkantar Open or Download the PDF File \u00a0\u00a0 1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Executive Summary Digital technologies have become an integral part of the lives of children and adolescents in Lebanon and around the world. Social media platforms now function as key spaces for communication, education, identity formation, entertainment, and civic participation. 
At the same time, these [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":6947,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[161],"tags":[],"class_list":{"0":"post-6956","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-publications-en"},"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.6 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Protecting Children in the Digital Environment: Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon | National Human Rights Commission - Lebanon<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/nhrclb.org\/en\/archives\/6956\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Protecting Children in the Digital Environment: Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon | National Human Rights Commission - Lebanon\" \/>\n<meta property=\"og:description\" content=\"Author: Bassam Alkantar Open or Download the PDF File \u00a0\u00a0 1.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Executive Summary Digital technologies have become an integral part of the lives of children and adolescents in Lebanon and around the world. Social media platforms now function as key spaces for communication, education, identity formation, entertainment, and civic participation. 
At the same time, these [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/nhrclb.org\/en\/archives\/6956\" \/>\n<meta property=\"og:site_name\" content=\"National Human Rights Commission - Lebanon\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/NHRCLB\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/nhrclb\/\" \/>\n<meta property=\"article:published_time\" content=\"2026-05-12T15:15:07+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-05-12T15:24:14+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/nhrclb.org\/wp-content\/uploads\/2026\/05\/NHRC-Publication-Bassam-Alkantar-Protecting-Children-En-scaled.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1810\" \/>\n\t<meta property=\"og:image:height\" content=\"2560\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"NHRCLB\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@nhrclb\" \/>\n<meta name=\"twitter:site\" content=\"@nhrclb\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"NHRCLB\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"104 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956\"},\"author\":{\"name\":\"NHRCLB\",\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en#\\\/schema\\\/person\\\/bcb35210564d89b3fafa40dbc39320bf\"},\"headline\":\"Protecting Children in the Digital Environment: Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon\",\"datePublished\":\"2026-05-12T15:15:07+00:00\",\"dateModified\":\"2026-05-12T15:24:14+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956\"},\"wordCount\":14203,\"publisher\":{\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/nhrclb.org\\\/wp-content\\\/uploads\\\/2026\\\/05\\\/NHRC-Publication-Bassam-Alkantar-Protecting-Children-En-scaled.jpg\",\"articleSection\":[\"Publications\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956\",\"url\":\"https:\\\/\\\/nhrclb.org\\\/en\\\/archives\\\/6956\",\"name\":\"Protecting Children in the Digital Environment: Social Media Restrictions, Platform Accountability, and Human Rights Implications for Lebanon | National Human Rights Commission - 