The Google-owned platform said it took down more than 111,000 videos it classed as hateful or abusive between April and June, up from just under 20,000 in the previous three months.
Channels posting such content also took a hit, with 17,800 banned compared to a little over 3,300 before.
Meanwhile, hateful comments deleted from the site doubled to over half a million.
YouTube said the spike was partly due to its tougher hate speech rules, which prohibit any video alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities such as gender, race or religion.
The policy also bans content that denies well-documented violent events, such as the Holocaust or the shooting at Sandy Hook Elementary.
Across all take-down categories, YouTube axed over 721,000 more videos than it did between January and March, making the latest total 9,015,566 – the vast majority of which were spotted by automated flagging.
YouTube said 81.5% of these were wiped from the service before anyone got to view them.
However, hateful and abusive content still makes up only a small share of video removals: videos deemed spam, misleading or scams accounted for 66.8% of the total, just under six million videos.
The company has long grappled with balancing the removal of hate speech against the protection of free speech.
Chief executive Susan Wojcicki recently admitted that it “sometimes means leaving up content that is outside the mainstream, controversial or even offensive”.
“I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views,” she said last week.
YouTube is currently testing changes to the way videos are recommended to users in the UK to prevent borderline content and misinformation from spreading.