Some potentially harmful hashtags were not accompanied by warnings, which campaigners say poses a "health risk" to users.

Image: Instagram is a popular smartphone app used for sharing photos and videos

Instagram says it is working to improve the artificial intelligence it uses to protect vulnerable users after communities promoting eating disorders were not picked up on the app.

Since 2016, the social media giant has had a policy that whenever a user searches for a potentially harmful topic like "#proanorexia", a warning pop-up appears and directs them to sources of help before they can access the content.

Under the platform's own rules, these pop-ups should accompany any search term related to sensitive topics.

However, Sky News discovered almost a dozen hashtags that were freely accessible with no warning, despite seemingly promoting unhealthy and dangerous attitudes towards food and body image.

The hashtag search terms were slight variations or alternative spellings of others that had already been flagged.

Instagram added the warnings within a few hours of being made aware, and said the machine learning it has used for the past six months to root out and flag these terms is still a work in progress.

Image: Instagram is owned by the social media giant Facebook

Daniel Magson, who has recovered from bulimia and is now vice chair of the charity Anorexia & Bulimia Care, said social networks like Instagram need to do more to educate and protect users.

He said: “It is incredibly dangerous and a real health risk.

“It’s not a safe space at all and these communities are promoting things like ‘these are the best places to dine with private toilets for afterwards’.

“They promote the best ways to injure or self-harm and that should not be allowed.”

Instagram says it keeps the hashtag topics live unless they frequently promote dangerous practices.

This is done on the advice of partner charities, which aim to intervene when necessary.

Image: Daniel Magson, vice chair of Anorexia & Bulimia Care charity, said social networks need to do more to protect users. Pic: Twitter

The social network recently announced it is doubling the number of people working across safety and security teams for Facebook and Instagram to 20,000 by the end of 2018.

This includes a team of 7,500 content reviewers.

The company, which is owned by Facebook, said in a statement: “We care deeply about making Instagram a place where people feel empowered, inspired and comfortable to express themselves.

“Every day, millions of people use Instagram to strengthen relationships with friends and build communities of support, particularly around body image.

"Instagram was created to foster a safe, kind and supportive community and we're committed to keeping it that way."

A study last year by the Royal Society for Public Health suggested that, of the big five social networks (Facebook, Instagram, Snapchat, Twitter and YouTube), Instagram had the most negative impact on users.

The news comes as tens of thousands of people across the world are predicted to be taking the month off social media as part of the Scroll Free September campaign.

:: On Sky News throughout September we'll be following a group of six people of all ages to see how they get on in their quest to go "cold turkey". Through video diaries from our volunteers, expert analysis and industry interviews, we'll be exploring the positives and negatives of social media.