Facebook’s Outage Reveals How AI Tags Photos

“One person, beard” – “3 people smiling, standing, indoor” – “6 people, indoor” – “child, closeup, indoor”. Instead of photos, tens of thousands of Facebook users saw texts like these on Wednesday. The reason: an automated system in one of Facebook’s databases, responsible for configuration values, failed and could not complete its task correctly (more here).

The result: users of Facebook, Instagram and WhatsApp could no longer see photos. The downtime lasted around two and a half hours and mainly affected users in the US and Europe. Instead of the images, empty fields with short texts were displayed. And these image descriptions are revealing: they show how Facebook uses artificial intelligence to tag users’ photos.

Since 2016, Facebook has used machine learning to read the content of photos. Users have long been able to actively tag objects, places and people in their photos, but with its ML algorithms Facebook also recognizes image content that users do not describe themselves.

Purpose: the automatically generated image descriptions are meant to give blind or visually impaired users the opportunity to use the platform actively. So-called “screen readers” can read the displayed texts aloud and tell them what is in the pictures.
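The descriptions users saw during the outage follow a simple pattern: a list of concepts a classifier detected in the image, joined into a short “Image may contain: …” string. As a minimal sketch of that last formatting step: the detection itself (a trained image classifier) is assumed here, and `format_alt_text`, the confidence threshold, and the example labels are all hypothetical, not Facebook’s actual code.

```python
def format_alt_text(concepts, threshold=0.8):
    """Turn hypothetical classifier output (label, confidence) pairs
    into an alt-text string in the style users saw during the outage."""
    # Keep only concepts the classifier is reasonably confident about.
    kept = [label for label, score in concepts if score >= threshold]
    if not kept:
        return "Image may contain: no description available"
    return "Image may contain: " + ", ".join(kept)

# Hypothetical classifier output for one photo.
detections = [("3 people", 0.97), ("smiling", 0.91),
              ("indoor", 0.88), ("tree", 0.42)]
print(format_alt_text(detections))
```

A screen reader would then read this string aloud in place of the image, which is exactly what became visible to sighted users when the photos failed to load.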

It is currently not known whether Facebook also uses the data it collects automatically from photos for advertising purposes. What is known is that the social network uses AI to automatically detect and, where necessary, delete dangerous or prohibited content. For this purpose, “billions of public photos” (including from Instagram) are used to train the systems.

Read also:

Facebook’s Libra: A cryptocurrency or a whole new fiscal system?
