We often celebrate technological advances that make our lives easier, but a glitch in Facebook’s app caused an uproar. The popular social media platform has been working to retain users after being accused of allowing false information to spread and thereby influence elections. Facebook has cracked down on “false news” and rolled out features meant to make its website and app easier to navigate, including artificial intelligence that offers suggestions to users.

However, that AI landed the company in hot water after users complained that, while watching a video featuring a Black man, they were asked whether they wanted to see additional “videos about primates.”


Facebook was quickly accused of racism, and the suggestion feature was reportedly taken down soon after. According to NPR, Facebook is investigating the issue and has, in the meantime, apologized to the public via The New York Times. A representative for Facebook called the issue “unacceptable” and added: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”

“This ‘keep seeing’ prompt is unacceptable, @Facebook,” former Facebook employee Darci Groves recently tweeted. “And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious.”