1 in 4 UK businesses worried about EU’s Digital Services Act 

New research from AI-powered content moderation company Besedo reveals that a quarter of UK businesses in the IT/telecoms sector are concerned about the cost of complying with the EU's proposed Digital Services Act (DSA).

With many businesses already under financial pressure following the pandemic, the DSA will add another set of obligations to meet. The scale of the problem it targets is considerable: Apple reportedly removed over one million listings for counterfeit and fake products in 2020.

The DSA will require online platforms to take more responsibility for dealing with harmful or illegal content and dangerous or counterfeit products, for example by providing a mechanism for users to flag such content and by cooperating with "trusted flaggers".

Topping the list of concerns about the DSA is a lack of understanding of how to comply with the regulations (25%), whilst a further quarter of respondents are concerned about the cost of compliance. The research also reveals apprehension about reputational damage from non-compliance, with 22% citing this as their biggest concern about the DSA.

The research shows an awareness gap, particularly between large and small businesses: 87% of large businesses consider the DSA to be "a wide-spanning act to create a safer digital space, affecting multiple platforms", compared with just 40% of small businesses. This suggests that small businesses are not fully cognisant of the fact that the DSA will affect them in serious ways.

Nearly half of businesses plan to have staff moderate content manually and manage the process in-house in order to comply. Over a third plan to build their own content moderation tool, and a further 22% say they will develop their own artificial intelligence to moderate their content.

Petter Nylander, CEO of Besedo, says:

“With the pandemic driving more consumers to use online platforms to shop, date and connect in a socially distanced world, the opportunity for fraudulent, harmful and upsetting content has increased.

“There’s no hiding from the fact this regulation will significantly impact how businesses operate online. The DSA will force businesses to change the way they approach content moderation to protect users against dangerous and fraudulent activity.”  

Nylander adds: 

“Businesses should start working on improvements now, to show customers that they are working with companies they can trust. Businesses cannot afford to take a cavalier attitude towards removing harmful content, not only as regulators crack down, but as users’ expectations of services rise.

“Effective content moderation, using a combination of AI and human moderators, ensures businesses can safeguard themselves and avoid reputational damage, as well as grow their business based on positive user experiences.” 

Chris Price
For latest tech stories go to TechDigest.tv