Recent news coverage explored the use of algorithms to identify vulnerable children. It was suggested that, in an era of cuts, such algorithms may offer a tool to identify children who might be at risk of ‘abuse, youth offending and truancy’.
Although no one would deny the importance of protecting children, one of the major concerns about using data to identify risk is that it de-skills all of us when it comes to talking about it.
It is not that technology cannot help, but rather that technology must sit alongside additional efforts to enable children to be part of a discourse about who they are and how they feel. This takes us back to the points in last week’s blog, where the importance of belonging was highlighted alongside the need for children to have a language so they can involve themselves in the conversation.
Abuse, offending behaviour and truancy are all issues where we will be better able to protect children if we can support them to develop the skills and the language to speak out.
Promoting an environment where the voice of the child is valued must be an ambition of anyone who wishes to protect the child. Algorithms could help to get a conversation started, but unless we invest in the ability of both adults and children to converse meaningfully, our ability to protect children effectively will always be limited.