Article ID: 345831
Journal: Children and Youth Services Review
Published Year: 2016
Pages: 7 pages
File Type: PDF
Abstract

• A balance between protection and freedom must be sought throughout monitoring.
• Adolescents want help with uncontrollable events (i.e. events they cannot solve themselves) on SNSs.
• Priorities for detection include public hate pages, fake accounts and nude pictures.
• Monitoring actions that aim to prevent harm are favoured.
• Monitoring must include consulting victims and sanctioning perpetrators.

Automatic monitoring of user-generated content on social networking sites (SNSs) aims to detect potential harm to adolescents by means of text and image mining techniques, followed by actions from the providers (e.g. blocking users, legal action). Current research is focused primarily on the technological development of such monitoring. However, it is important to involve adolescents' voices regarding the desirability of this monitoring, particularly because automatic monitoring might invade adolescents' privacy and freedom and consequently evoke reactance. In this study, fourteen focus groups were conducted with adolescents (N = 66) between 12 and 18 years old. The goal was to obtain insight into adolescents' opinions on the desirability of, and priorities for, automatically detecting harmful content on SNSs. Opinions reflect the tension between a need for protection online and the preservation of freedom. Most adolescents in this study are in favour of automatic monitoring for situations they perceive as uncontrollable or that they cannot solve themselves. Clear priorities for detection must be set to ensure the privacy and autonomy of adolescents. Moreover, monitoring actions aimed at the prevention of harm are required.

Related Topics
Health Sciences > Medicine and Dentistry > Perinatology, Pediatrics and Child Health