Twitter is lagging behind other social networks when it comes to removing users that post child abuse images.

According to the BBC, the Child Exploitation and Online Protection Centre (CEOP) has urged the micro-blogging site to investigate whether paedophiles use the site to discuss abuse and link to pornographic images.

The report cited former detective turned child-protection expert Mark Williams-Thomas, who claims that despite being reported for breaching Twitter's child protection policy, some users remain active on the micro-blogging site for weeks after a report has been filed.

"There is always going to be a problem with social networking sites, because where there is an opportunity offenders will seek that out," he said.

"Clearly what Twitter needs to do is to take responsibility for its users. And when they identify there is somebody promoting child abuse material, swapping it or even discussing it, the site must come down straight away."

Twitter said it deals with reports of inappropriate material "immediately".

"When we receive a report and identify it as valid, we take action immediately," said Del Harvey, Twitter's Director of Trust and Safety.

"Accounts being reported may be the subject of law enforcement investigations. In those instances, while the profiles are certainly disturbing, removing them immediately can actually harm the cases that law enforcement may be attempting to build."

While rival social networks such as Bebo and Facebook have installed so-called 'Panic Buttons' that allow users to flag up inappropriate content and get access to advice on how to stay safe online, Twitter currently requires users to scour the site for an email address where they can report concerns over users and the material they have posted.

"They [Twitter] are a little bit behind some other sites that have been around a little bit longer," said Peter Davies, the chief executive of CEOP.

Christian Sjoberg, CEO at NetClean, a firm that offers technical solutions for preventing the spread of child sexual abuse content, said web users need to remember that Twitter and other social networks are not themselves at fault, even as paedophiles increasingly use such sites to share and view illegal child sexual abuse content.

"Perpetrators are becoming evermore sophisticated in their methods, and many organisations are still unaware that sharing, downloading and viewing of this content is taking place within their workplace, on their own corporate networks," said Sjoberg.

"Child sexual abuse is a problem that exists in societies across the world, not just on the internet. Therefore the onus should not be on companies to address the problem, but countries, their governments and Police forces, must all show commitment to taking this global issue seriously, if we are to win the battle against child sexual abuse."