Peter Fitzgerald

Do you ever feel like it’s harder than it used to be to get the answers you’re looking for online? You’re not alone. When scrolling through endless streams of meandering articles, it’s easy to wonder if any of it really serves the user any more.

In this article, we delve into the pressing question: Does content need to be more personal to thrive in 2024? Follow along to find out how content must evolve to keep up with changes in user search behaviours.

Making content more human

I was reading through the Content Marketing Institute’s take on the key trends in content marketing for 2024, and while there was all the stuff you’d expect to see in there (AI, AI, and more AI), something else stood out to me.

There’s a pattern within these predictions around the need for a more genuine, human element to content in the coming year. Some of the prediction titles include:

  • “Sincerity rises above the algorithms”
  • “The secret sauce is people”
  • “Authentic voices resonate”
  • “People still want to connect with other people”

There is also this quote:

“Nothing replaces the raw, authentic emotion of human-generated content.”

Reading through these predictions left me with a palpable sense that something needs to happen, something needs to change. But when I spent some time thinking about it, it left me with a question I struggled to answer: What does this actually mean?

Re-visiting content shock

Back in 2014, Mark Schaefer wrote an article called Content Shock: Why Content Marketing is not a Sustainable Strategy – it was considered quite an incendiary idea that the push towards content marketing would create diminishing returns.

At the time, content marketing was seen as the answer to the increasingly paid-focused direction of platforms like Facebook, and Mark’s suggestion that this wasn’t sustainable was something many in the digital world didn’t want to hear. Looking back at the article now, Mark’s predictions are shockingly prescient.

If anything, I would say Mark’s predictions don’t go far enough. The reality we find ourselves in goes beyond a simple issue of volume against capacity – it’s certainly true there is too much content for consumers to evaluate and respond to. We see that across media, from streaming platforms to gaming to sports – with The Economist publishing an article in August 2023 called ‘Is there too much football?’ (to my mind, the answer to that question is always yes).

We’re way past the point of no return on volume – but I think content shock in 2024 has taken on a different shape. It’s not just about the volume that’s being produced, but about the nature of that content, and the way the texture of it has been shaped by the algorithms that decide what we’re shown.

A shock to trust

So how does this relate to those predictions about the need for more human, genuine, personal content?

Well, I think ChatGPT is a great case study for what’s going on in the world of content more broadly.

When the scope of what GenAI could do really started to be understood and explored in the middle of last year, the immediate questions were: what will this do to online content? And how will search engines and social networks respond to AI-generated content?

I remember having conversations about the idea that Google was going to algorithmically identify AI-generated content – the main point of discussion being how Google could actually tell that content had been generated by AI.

Fast forward to today and that idea seems far less strange. Having spent time using ChatGPT and seeing the work it produces, I can now sometimes identify content that has been generated by an LLM – and I have heard colleagues do the same.

If our human brains, without the depth of computing power Google can bring to bear on the problem, can spot patterns that tell us (sometimes unconsciously) that content has been generated by AI, then we shouldn’t be surprised that Google can do this systematically.

Human brains are very good at spotting patterns and using those patterns to inform how we feel about things. On many occasions now, I have been reading something online and had the distinct sense that I was reading something written by ChatGPT.

While Google’s stance on AI-generated content has softened since those early conversations, the important thing to take away from that idea is that copy written by AI is identifiable, both by algorithms and by good old-fashioned humans.

The situation we find ourselves in now is therefore more extreme than the one originally posited in Mark’s article. Not only is the output of content vastly outstripping the demand to consume it, but consumers also have to question whether the content that does reach them was even written by a human.

Particularly as issues around the factual accuracy of LLM-generated content continue, consumers will come to associate those intrinsic patterns (whether or not they know they are recognising AI-generated content) with that unreliability. In other words, a swathe of inherently less trustworthy content has been added on top of the pre-existing over-saturation problem.

Is human-generated content the answer?

If AI-generated content risks alienating consumers, then the answer is to ensure all your content is written in a meaningful, personal, insightful manner by talented human beings. Simple as that, right?

Unfortunately, when we look at how users consume even the ‘best’ content, we see consumers making judgements about the quality of the information presented to them – and those judgements aren’t positive signs for content marketers.

What we’ve done to search

Search is one of the most influential arenas when it comes to content – and not only because it dictates which content gets traffic.

Thanks to SEO, search has become a self-fulfilling prophecy. It’s no longer that content happens to match the ranking parameters and is rewarded for it – instead, content is written specifically to fit those parameters.

Because of the parameters that influence those rankings (particularly once you factor in the way semantic search aggregates groups of related topics and search terms), the content you’ll find in search tends to trend in a certain direction.

Content has to be of a certain length and density. Content has to be ‘rich’. Content has to contain semantically related phrases and ideas.

This is all great if your user is looking into a topic that warrants that level of rigour and detail. But when your user is just looking for an answer to a simple question, not so much.
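To make ‘written to fit the parameters’ a little more concrete, here is a rough, purely illustrative sketch of the kind of surface-level check a draft might be run through. The word-count threshold and phrase list are invented for the example – they are not anything Google publishes – but they show how easy it is to write to the ruler rather than to the reader.

```python
def seo_shape_check(draft: str, related_phrases: list[str], min_words: int = 1200) -> dict:
    """Toy check of a draft against crude 'shape' parameters:
    is it long enough, and does it mention the semantically related phrases?
    Illustrative only - real ranking systems are far more sophisticated."""
    words = draft.split()
    lowered = draft.lower()
    return {
        "word_count": len(words),
        "long_enough": len(words) >= min_words,
        "phrases_covered": [p for p in related_phrases if p.lower() in lowered],
    }

# Example: a short, direct answer 'fails' the shape check,
# even if it's exactly what the reader wanted.
print(seo_shape_check(
    "Yes - you can freeze fresh basil. Blitz it with olive oil first.",
    related_phrases=["freeze basil", "olive oil", "ice cube tray", "pesto"],
))
```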

We can see this in the common ‘hack’ of adding ‘Reddit’ to the end of a search to get a better answer to your question. Back in February 2022, Android Authority found that 70% of a sample of 2,400 users had used this hack.
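For clarity, the ‘hack’ is nothing more sophisticated than changing the query itself. Here’s a minimal, purely illustrative sketch of the two variants a user might end up typing (the stricter site: operator is a common variation on simply appending the word ‘reddit’):

```python
from urllib.parse import quote_plus

def google_search_url(query: str, reddit_hack: bool = False) -> str:
    """Build a Google search URL, optionally applying the 'Reddit hack'.

    The simplest form of the hack is appending the word 'reddit' to the
    query; a stricter variation uses the site:reddit.com operator instead.
    """
    if reddit_hack:
        query = f"{query} reddit"
    return f"https://www.google.com/search?q={quote_plus(query)}"

# The same question, with and without the workaround.
print(google_search_url("best budget running shoes"))
print(google_search_url("best budget running shoes", reddit_hack=True))
```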

So, why do users do this? Google is the best way to get the answers you need, right? The best content rises to the top, and Google prioritises good user experiences, right?

This behaviour suggests that not all is right in the world of search. “Just Google it”… doesn’t ring as true as it used to.

Going straight to the source

To dig into this, I searched ‘why do users put Reddit at the end of Google searches’ on Google (which felt a little like dividing by zero).

My SERP looked like this: [screenshot: a results page dominated by Reddit threads discussing the question, several of them years old]

The first thing that jumped out to me was the age of some of these threads. The second thing was that Google thought the best answers to this question were the Reddit threads discussing it (which added to the feeling of the internet eating itself in front of me).

So the first thing we can say is that this isn’t a new idea (it’s at least five years old), nor – based on the research from Android Authority – is it an out-of-date one.

Looking at the thread ‘Does anyone else always add “Reddit” at the end when Googling stuff because you get good answers?’, a few specific comments jumped out at me.

These threads and comments – which vary in age but were mostly posted in the last few years – speak to a segment of users who have a preconception about the kind of content search is going to serve them. There’s the frustration with ads you might expect, but it’s more interesting to see similar frustration aimed at organic search content.

On their page How results are automatically generated, Google states that the quality of content is an important factor in which content ranks well:

“After identifying relevant content, our systems aim to prioritise those that seem most helpful. To do this, they identify signals that can help determine which content demonstrates expertise, authoritativeness and trustworthiness.”

When we look at the motivations behind appending Reddit to those searches, we see indicators that Google’s aim and the reality of the user experience aren’t lining up.

Squaring the circle

And finally, we get back to the original prediction – that content needs to be more human, more genuine, and that the secret sauce is people, not algorithms.

I think that’s certainly true from a user experience perspective. The ‘Reddit search hack’ concept is just a demonstration of users finding ways around what they perceive to be the failings of search to serve their needs.

They go to Reddit for content written by real people, about the real thing they’ve asked about, to get advice or information or ideas from people who’ve actually done it, seen it, or lived it. The perception is that your average Google search result doesn’t provide that. It provides something built to rank for search, rather than something human.

This feels like an almost inevitable outcome of Mark Schaefer’s original thesis. The competition around content, particularly in search, is extremely high. The sheer volume of content in contention is hard to comprehend – on the page How Google Search organises information, it’s stated that “The Google Search index contains hundreds of billions of web pages and is well over 100,000,000 gigabytes in size.”

It therefore makes sense that the content written to best compete in that space, rather than written to best serve the user, is the content that rises to the top.

The issue is that those two things are meant to be aligned, but in reality, users don’t always find this to be the case.

So where does that leave us?

  • The content in contention for our attention vastly outstrips our ability to consume it.
  • GenAI hands creators (both genuine creators and bad actors) the ability to further expand that output.
  • Content marketing experts recognise the need to further emphasise the human connection between writer and reader.
  • Users perceive that web content won’t meet that need – that it is written to serve algorithms and advertising rather than the reader.
  • Users find workarounds to circumvent the ‘best’ results from search and take them straight to real human conversation.

Ultimately, the predictions from the Content Marketing Institute article are accurate, though perhaps not as predictions of what will happen in 2024, but rather as a response to what is happening right now.

What users want is not served by the relevance death spiral that search and content find themselves caught in. And so while it’s true that users want more ‘human’ content, I question whether brands can really respond to that need within the systems we currently have to put that content in the hands of those users.

Where do we go from here?

The original article on content shock felt like someone had announced the end of the world for content marketing – but just as that wasn’t the end of the road for content, none of this will be fatal either.

One prediction would be a refactoring of organic search to better align the results users see with Google’s mission of providing the best answers. Google relies on users seeing it as the best (or at least the easiest to use) source of information online, so they are incentivised not to let content shock damage the quality of the results they serve.

Alternatively, the traditional search experience could be left as it is, with Search Generative Experience (SGE) positioned as the solution that cuts through the noise – though the efficacy of this will depend on what kind of content the market is incentivised to create in order to appear within it.

I have heard discussions that take this a step further – a version of search that cuts out the SERP entirely. Imagine a search experience not dissimilar to giving prompts to ChatGPT. You ask the platform your questions, and rather than giving you a set of search results, it gives you an aggregated answer. If you didn’t get the information you needed, you either ask a follow-up question or ask it what else it can find.
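As a thought experiment, the interaction pattern being described is easy to sketch. The code below is purely hypothetical – the synthesise function is a stand-in for whatever retrieval-and-generation backend a platform might plug in, not a real API – but it shows how a single aggregated answer plus follow-up questions replaces a page of results:

```python
from typing import Callable

def conversational_search(synthesise: Callable[[list[str]], str]) -> None:
    """Toy loop for a SERP-less search experience: the user asks a question,
    gets one aggregated answer, and refines it with follow-ups.
    'synthesise' is a hypothetical stand-in for the platform's
    retrieval-and-generation backend, not a real API."""
    conversation: list[str] = []
    while True:
        question = input("Ask> ").strip()
        if not question:
            break
        conversation.append(question)
        # One synthesised answer replaces the traditional list of blue links.
        print(synthesise(conversation))

# Example with a dummy backend, just to show the shape of the loop.
if __name__ == "__main__":
    conversational_search(lambda history: f"(aggregated answer to: {history[-1]!r})")
```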

We are already seeing the first steps in this direction with recent trials by Google, where GenAI is being used to create aggregated overviews in search results – this was previously trialled in the US, and the trial is currently live in the UK.

Bing Copilot also provides a version of this experience, which will likely only continue to develop and improve over the coming months.

What does content or SEO look like in this new environment? It’s hard to say. With the pace of development, full adoption of a solution like this may not be that far away (though how desirable this kind of platform would be in the long term to a provider like Google is questionable, as the question of how to sell advertising space looms large).

For now, making sure your content is human, genuine, authentic and insightful is still the right thing to do – but that’s nothing new.

Doing so with an awareness of the factors that are both shaping your audience and driving future change, however, will help you stay on the front foot and ready for the future of content. Mastering that evolution of search and content strategy is crucial for success.

From humanising your content to optimising it for the latest search engine algorithms, we provide comprehensive solutions tailored to your needs. Contact us to explore how we can help you excel in both realms, ensuring a holistic and effective approach to digital content.
