DMA North: Artificial Engagement - Manchester Roundtable


Industry players participated in a roundtable discussion in Manchester to explore what artificial engagement is, how to quantify the size of it, and who is accountable for delivering a solution.

Mark Greenwood, Chief Technical Architect and Head of Data Science at Netacea, spoke about bot traffic: the forms it takes, its threat to businesses, and how to spot and stop it.

There are bots with good intentions, such as time checkers, and bots with malicious aims, such as account takeover – and lots of grey areas in between. For instance, scraping is benign when it comes from search engine bots, but it is also a potential means of stealing content. Regardless of intent and outcome, bots can't be avoided: 53% of web traffic comes from bots, and 33% of that is malicious.
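As a minimal illustration of the grey area described above, a first-pass triage might sort requests by their declared user agent. This is a hypothetical sketch only: the bot names and tool hints below are illustrative assumptions, and user-agent strings are trivially spoofed, so real bot management goes well beyond this.

```python
# Hypothetical first-pass bot triage by declared user agent.
# Caveat: user-agent strings are trivially spoofed, so this is a
# starting point, not a defence.

KNOWN_GOOD_BOTS = ("googlebot", "bingbot")                # e.g. search engine crawlers
SUSPICIOUS_HINTS = ("python-requests", "curl", "scrapy")  # common scraping tools

def triage_user_agent(user_agent: str) -> str:
    """Return 'good-bot', 'suspect-bot', or 'unknown' for a UA string."""
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_GOOD_BOTS):
        return "good-bot"      # still worth verifying, e.g. via reverse DNS
    if any(hint in ua for hint in SUSPICIOUS_HINTS):
        return "suspect-bot"
    return "unknown"           # most human traffic, and most disguised bots

print(triage_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # good-bot
print(triage_user_agent("python-requests/2.31"))                     # suspect-bot
```

The "unknown" bucket is the hard part of the problem: sophisticated malicious bots deliberately look like ordinary browsers, which is why the discussion below turns to behavioural signals rather than labels.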

Understanding the nature and scale of the above problem has become non-negotiable, so what are the steps we can take?

Awareness

It was agreed that not everyone will want to acknowledge or address the issue – when the numbers are working in your favour, why would you want to undermine them?

However, it was also noted that most people probably don’t realise the extent of ‘fake’ engagement. Awareness isn’t just critical amongst marketers – it’s essential all the way up the chain. Unless all stakeholders are informed, driving change will likely only be met with resistance.

Accountability

Who is answerable for the problem of artificial data? And how do we actually ensure that they take responsibility?

“We as agencies have to put accountability where it belongs,” said Sara Simone, Digital Oracles. “The publishers at the top!”

Do platforms like Google and Facebook see themselves as platforms or publishers? Do agencies really have a leg to stand on in demanding answers from these big players?

Greenwood commented that while there’s ample interest in the issue of artificial engagement, the high costs associated with it mean people are disinclined to actually take ownership.

Some argued that we all have to be answerable: “It comes full circle,” said Lucy Nolan, Dot Digital. “You have ad networks and all these major companies, and you could argue it’s their problem to solve. But we’re all feeding into it.”

Reliability

The discussion moved on to the issue of data reliability. Knowing what we do now, can we take data at face value? And how do we get around the problem to make informed decisions and report accurately on performance?

The consensus was that the focus should shift to the quality of information. Education is needed both to combat the vanity metrics that stand in the way of tackling artificial engagement and to improve the reliability of data sampling.

Concerns were also raised about how to reassure clients about which information is reliable. One suggested way around this was to look at the behavioural journey rather than raw web traffic.
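A hedged sketch of what "looking at the behavioural journey" might mean in practice: flag sessions whose request timing is too fast and too regular to be human. The function name and thresholds below are illustrative assumptions, not industry-calibrated values.

```python
from statistics import pstdev

def looks_automated(request_times: list[float],
                    min_gap: float = 0.5,
                    min_jitter: float = 0.05) -> bool:
    """Heuristic: flag a session as likely automated if its requests
    arrive faster, or more uniformly, than a human plausibly clicks.
    request_times: increasing timestamps in seconds.
    Thresholds are illustrative, not calibrated against real traffic."""
    if len(request_times) < 3:
        return False  # too little behaviour to judge either way
    gaps = [b - a for a, b in zip(request_times, request_times[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return avg_gap < min_gap or pstdev(gaps) < min_jitter

# A human-like session: irregular, multi-second gaps between page views.
print(looks_automated([0.0, 3.2, 9.8, 14.1]))      # False
# A scripted session: one request every 100 ms, like clockwork.
print(looks_automated([0.0, 0.1, 0.2, 0.3, 0.4]))  # True
```

The appeal of behavioural signals is that, unlike page-view counts, they are expensive for a bot to fake convincingly at scale.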

Intelligence

How can we futureproof against artificial engagement from an industry, client, and agency perspective? Bots are getting smarter in their efforts to fake identities, while real consumers are increasingly opting to hide their identities, further blurring the lines between genuine and artificial engagement.

It was suggested that, from the very start of the design process, we should be thinking about a real user's intent and how exactly they behave.

The roundtable turned to the critical nature of education, not just for marketers but end-to-end, from the C-suite to consumers. Consumers need to know why you want to identify them and how it will improve their journey; otherwise, they will be more likely to reject your request for their data.

The Future for Digital Engagement

A line could eventually be drawn under artificial engagement, but the sheer scale of the issue remains a concern. The ultimate consensus is that awareness and education are key.


To become involved in further discussions that capture research and recommendations on artificial engagement, take our online engagement survey.

You can also share your experiences with Anna Lancashire, DMA North Community Manager, at anna.lancashire@dma.org.uk

