The Facebook-owned platform has come under increased scrutiny following a number of self-harm incidents.
Karina Newton, head of public policy, and Vishal Shah, head of product at Instagram, were asked about the efforts the company had taken to better protect people’s mental health – particularly that of the young – in an evidence session with the Digital, Culture, Media and Sport Committee.
Damian Collins, committee chairman, cited previous concerns about body image and filters on apps such as Instagram and Snapchat, which can digitally make skin appear smoother or lips fuller.
“It is something that we are looking at closely. There were a few important considerations for us when we were launching filters,” Mr Shah explained.
“Filters are much more recent on Instagram than on our other platforms. One consideration was that they had to be opt-in – people had to choose to apply a filter – and another was that they were not restricted to filters that can alter appearance, but included things like putting on a hat or changing the world around you.
“I think that body image is a really important topic broadly speaking, not just on Instagram, and this is something that we are taking really seriously, especially in some of the research we are doing.
“I think it’s an important point, considering the broad use cases, whether we should be taking a harder look at those filters.”
Instagram recently confirmed that it will start a test in Canada to hide photo likes in a bid to reduce popularity pressure on users.
In February, the company announced a ban on graphic images of self-harm and the removal of non-graphic images of self-harm from searches, hashtags, and the explore tab, following criticism from Health Secretary Matt Hancock who said social media companies “need to do more” to curb their impact on teenagers’ mental health.