Relevant research and experience from around the world indicate that direct state interference in the regulation of access to digital content, whatever the pretext, leads to misuse of the established mechanisms even in developed democracies. The evidence comes primarily from the European countries where members of the EDRi association (European Digital Rights) operate, including Macedonia, from other European countries such as Turkey and Belarus, and from the United States, Venezuela, Australia, Saudi Arabia, Iran and China. The typical abuses are censorship, excessive surveillance, exploitation for ideological goals, politicized repression and the endangerment of fundamental human rights.

Furthermore, the blocking mechanisms established so far have mostly proven ineffective at achieving their stated child-protection goals and are, in effect, a waste of limited budget resources.

All forms of risk prevention must balance the need for protection against criminal acts with a realistic understanding of the advantages that new technologies bring, and must rest on transparency and accountability mechanisms that give citizens the means to directly prevent or mitigate the consequences of misuse by the competent institutions.

Therefore, in protecting children from the unwanted consequences of using new technologies, the state would do best to focus its energy on:

  • Strengthening the capacity of the educational system to give pupils, students, teachers and pedagogues the knowledge and skills to deal with risks, by adapting curricula to today’s needs.
  • Raising public awareness of the importance of privacy and security in new media, especially among parents and guardians.
  • Strengthening the capacity of the social protection system for prevention and management of risks associated with new technologies.
  • Funding comprehensive research studies to determine the actual level of risk and the most suitable methods for addressing it.
  • Providing the space and environment for the creation of public policy through an inclusive public debate involving all stakeholders.

The development of mechanisms for content categorization and labeling should remain at the level of:

  • Self-regulation by the manufacturers and distributors of such content, including computer games, and
  • Self-regulation by the users themselves, for example through individual filtering systems in the family home, selected by the parents or guardians themselves rather than imposed by laws or regulations (a minimal sketch follows below).
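To make the second point concrete, household-level filtering can be as simple as a deny-list check running on a device the family controls. The Python sketch below assumes a hypothetical, parent-maintained set of blocked domains; the domain names are placeholders, not references to real sites. The essential feature is that the parents, not a regulator, decide what goes on the list.

    from urllib.parse import urlparse

    # Hypothetical, parent-maintained deny list; the entries are placeholders.
    BLOCKED_DOMAINS = {"unwanted-example.test", "another-example.test"}

    def is_blocked(url: str) -> bool:
        # Block the listed domains and any of their subdomains.
        host = urlparse(url).hostname or ""
        return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    if __name__ == "__main__":
        for url in ("http://unwanted-example.test/page",
                    "http://en.wikipedia.org/wiki/Privacy"):
            print(url, "->", "blocked" if is_blocked(url) else "allowed")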

In all its activities, the state must bear in mind that although free access to the internet and other new technologies is not yet internationally recognized as a fundamental human right (with the exception of pioneers such as France and Finland), it is nevertheless an extremely important right that is constantly reaffirmed in the EU. Among other instruments, the European Parliament’s Resolution on Cultural Industries in Europe describes the Internet as a “wide platform for cultural expression, access to knowledge and democratic participation in the European creative sphere, bringing together different generations in the information society” and places it under the protection of the right to freedom of expression.

A study on the effectiveness of filtering measures commissioned by the Government of Australia identified the following shortcomings of this approach:

  • All filtering systems can be bypassed by using easily available software.
  • Censors maintaining blacklists cannot keep pace with the volume of new content published on the Internet every second.
  • Filters that use real-time analysis to determine whether content is inappropriate are ineffective: they can block access to legitimate content, they are easy to bypass, and the more accurate they are, the more they slow down the network (see the sketch after this list).
  • Entire sites that host user-generated content, such as YouTube or Wikipedia, could be blocked over a single inappropriate video or article.
  • Filters would be very expensive and difficult for ISPs to maintain, forcing many small providers out of business.
  • Blacklists prepared by the authorities can easily leak to the public, because unlike end users, who have no access to them, all of an ISP’s employees must have access to these lists.
  • Government filters cannot censor content exchanged over peer-to-peer networks such as LimeWire, chat systems, e-mail and instant messaging (or torrent files).
  • Both the providers and the Government could bear legal responsibility for shortcomings in the scheme, especially if e-publishers have no right to challenge unwarranted blocking.
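As an illustration of why real-time analysis tends to overblock, the following minimal Python sketch flags any text containing a listed word. The word list and sample texts are invented for demonstration; real filters are more elaborate, but they face the same trade-off between accuracy and overblocking.

    # Deliberately naive real-time content filter: flag any text that
    # contains a word from the list. The word list is hypothetical.
    FLAGGED_WORDS = {"breast", "drugs"}

    def is_inappropriate(text: str) -> bool:
        # Strip basic punctuation so "drugs." still matches "drugs".
        words = (w.strip(".,;:!?") for w in text.lower().split())
        return any(w in FLAGGED_WORDS for w in words)

    if __name__ == "__main__":
        # A health notice is blocked alongside genuinely unwanted content,
        # showing how keyword matching punishes legitimate pages.
        print(is_inappropriate("Early screening for breast cancer saves lives."))  # True (overblocked)
        print(is_inappropriate("Local library opening hours."))                    # False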

Metamorphosis Foundation
