Microsoft has introduced a cloud version of its PhotoDNA service, which will allow photo-sharing providers and social networking sites to identify and remove child sexual abuse images.

The software maker said that of the 1.8 billion photos uploaded daily, an estimated 720,000 are illegal images, which has become a major problem for photo-sharing services.

Microsoft collaborated with Dartmouth College to create the on-premises version of the service in 2009 to address the problems faced by the National Center for Missing & Exploited Children (NCMEC).

It was later adopted by tech giants such as Facebook and Twitter, along with 70 other companies across the globe.

NCMEC Exploited Child Division vice president John Shehan said: "Certainly, it’s important from a victims’ rights perspective; these are crime scene photographs.

"Microsoft providing this service is immense."

According to Microsoft, the cloud version of PhotoDNA is much faster than its predecessor and requires far less time, money and technical expertise to use, making it a better choice for smaller companies and other organisations.

The technology works by converting images into a greyscale format and computing a digital signature, or hash, for each one, allowing it to detect photos that have been edited and re-uploaded.
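The actual PhotoDNA algorithm is proprietary, but the general idea behind this kind of robust image matching can be sketched with a simple perceptual hash: convert to greyscale, downscale to a fixed grid, threshold each cell against the image mean, and compare signatures by Hamming distance so that lightly edited copies still match. The function names below are illustrative, not part of any Microsoft API.

```python
# Illustrative perceptual-hash sketch (NOT Microsoft's PhotoDNA algorithm,
# which is proprietary). Images are plain lists of rows of (r, g, b) tuples.

def to_greyscale(rgb_image):
    """Convert an RGB image (rows of (r, g, b) tuples) to greyscale values."""
    return [[(r + g + b) // 3 for (r, g, b) in row] for row in rgb_image]

def downscale(grey, size=8):
    """Block-average a greyscale image down to a size x size grid."""
    h, w = len(grey), len(grey[0])
    out = []
    for i in range(size):
        row = []
        for j in range(size):
            block = [grey[y][x]
                     for y in range(i * h // size, (i + 1) * h // size)
                     for x in range(j * w // size, (j + 1) * w // size)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

def signature(rgb_image, size=8):
    """One bit per grid cell: 1 if the cell is brighter than the mean."""
    small = downscale(to_greyscale(rgb_image), size)
    flat = [p for row in small for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(sig_a, sig_b):
    """Number of differing bits; small distances indicate likely copies."""
    return sum(a != b for a, b in zip(sig_a, sig_b))
```

Because the signature depends on coarse brightness structure rather than exact pixel values, a slightly brightened or re-compressed copy of an image yields a signature close (in Hamming distance) to the original, which is the property that lets a service match re-uploaded edits against a database of known hashes.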

Microsoft Digital Crimes Unit senior attorney Courtney Gregoire added: "Finding these known child sex abuse images in that huge universe is like finding a needle in a haystack.

"We needed an easier, more scalable way to identify and detect these worst of the worst images … and that’s how the concept for PhotoDNA in the cloud was born."