Microsoft has announced a thorough review of its ethical guidelines on the use of artificial intelligence, as described on its official blog, with the goal of defining a concrete framework for the responsible development of AI systems. As part of this decision, the company will also retire several AI-based face-analysis tools, including one designed to infer a subject's emotions from videos and images. Experts have criticized emotion-recognition tools in particular, arguing that inferring internal emotional states from external expressions is unscientific and noting that facial expressions vary across populations and cultures. Microsoft will therefore restrict access to certain features, as stated in the transparency notes it publishes: the restrictions concern the facial-recognition capabilities of the Azure Face service, while other features will be removed altogether. The decision does not affect capabilities considered harmless or useful, such as the automatic blurring of faces in images and videos for privacy reasons, which will remain openly available.
Among the features being removed is one of Azure Face's more controversial capabilities: the recognition of attributes such as gender, age, smile, facial hair, hair, and makeup. Microsoft stopped offering these features to new customers on June 21, and existing customers will lose access on June 30 of next year. Although public access is being phased out through 2023, Microsoft will continue to use some of these capabilities in its own products, including Seeing AI, an app that uses computer vision to describe the surroundings to blind and visually impaired people. This is not the first time an emotion-recognition AI tool has been at the center of controversy: consider, for example, the Intel and Classroom Technologies project for detecting students' emotional states, whose problems were also highlighted by Protocol. We refer you to that article for further details.
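For context, the retired attribute analysis was exposed through the Face API's detect endpoint via a `returnFaceAttributes` query parameter. The sketch below only assembles such a request without sending anything over the network; the endpoint and key are placeholders, and the exact attribute names are illustrative assumptions based on the publicly documented shape of the API.

```python
# Sketch: how an Azure Face "detect" request including the now-retired
# attribute analysis was typically assembled. The endpoint and key are
# placeholders; this code builds the request but does not send it.

# Illustrative attribute names of the kind Microsoft is retiring.
RETIRED_ATTRIBUTES = ["age", "gender", "smile", "facialHair", "hair", "makeup", "emotion"]

def build_detect_request(endpoint: str, api_key: str, attributes: list) -> dict:
    """Return the URL, headers, and query params for a Face detect call."""
    return {
        "url": f"{endpoint}/face/v1.0/detect",
        "headers": {
            "Ocp-Apim-Subscription-Key": api_key,
            "Content-Type": "application/octet-stream",  # raw image bytes
        },
        "params": {
            # Comma-separated list of face attributes to analyze.
            "returnFaceAttributes": ",".join(attributes),
            "returnFaceId": "false",
        },
    }

request = build_detect_request(
    "https://example.cognitiveservices.azure.com",  # placeholder endpoint
    "<your-key>",                                   # placeholder key
    RETIRED_ATTRIBUTES,
)
print(request["params"]["returnFaceAttributes"])
```

With access to these attributes revoked, such a request would now fail or return a reduced set of fields, which is precisely the behavioral change existing customers will see after the cutoff date.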