Search landscape

What’s over the horizon for search?

The search landscape is becoming more fragmented, with voice, image and AI gaining ground

Search today

In many ways, search has become an unconscious behaviour. For a long time now, the address bar in our browsers has doubled as a search box, making it irrelevant whether or not we know the URL of the website we are after – many people would not even think of this “shortcut” as a search. And with search engines using personal data to qualify and personalise results, the process has become so quick and frictionless that it’s no surprise people struggle to recall how many searches they have carried out in a single day, or what they were.

People have become increasingly reliant on their devices to provide them with information whenever and wherever they need it. Searching for answers to specific questions is the most popular type of online search, according to our study (see chart, right). We estimate over two-thirds of people carry out searches using their mobile, and with the shift from offline to online shopping, search is undoubtedly playing a much bigger part in product-based decision-making: two-thirds of people we surveyed admit they do more searches before making decisions than they used to, rising to three-quarters of 18-to-34-year-olds. Product-specific searches now account for a fifth of all search activity, and from our proprietary decision-making research tool DX, we know that these behaviours can be seen across multiple sectors and products, not just the big-ticket decisions. Increasing levels of exposure to technology have reset people’s expectations, impacting every decision they make. Among tech-confident users in particular, search is more important than ever, as they seek out the results that will enable them to compare more items, find out more information, and explore more reviews and recommendations before they commit.

It’s worth noting that the way people think about search in the real world is very different from how it’s discussed in the industry. Rather than considering a single search as a single query, they talk about what we have defined as a “search moment” – one or a number of searches all relating to the same end goal (e.g. finding a product they want to buy). This makes the task of assessing the number of searches taking place a complicated one, but through our research we were able to estimate that people perform an average of 22.2 searches per day – bringing the volume of daily searches in the UK to just over 1 billion.
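As a rough sanity check on that estimate, the per-person average and the UK total reconcile as follows. The searcher population used below is our illustrative assumption, not a figure from the report, which states only the average and the total:

```python
# Back-of-envelope check on the daily UK search volume estimate.
# The searcher population is an assumed, illustrative figure –
# roughly the UK online adult population.
searches_per_person_per_day = 22.2
uk_active_searchers = 46_000_000  # assumption, not from the report

daily_searches = searches_per_person_per_day * uk_active_searchers
print(f"{daily_searches / 1e9:.2f} billion searches per day")  # → 1.02 billion
```

Any searcher population in the mid-40-millions yields the “just over 1 billion” daily total quoted above.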

Share of search diagram

How people search is also more sophisticated than it’s ever been – they are searching deeper, wider and in new ways, which is transforming the search landscape, making it more varied and fragmented (see opposite page). While more than three-quarters of search activity is still done via text, the rise in smart speaker adoption has facilitated a growth in voice searches, which we can see becoming more mainstream in the near future. Visual search, on the other hand, remains quite niche, accounting for only a fraction of all searches. Most people are still unaware of the technology, and from our research we have found that there is a level of confusion around the concept of visual search – people who claim to have used it often talk about simply searching for images online. But as speech and image recognition become more accurate over time, and these new technologies are integrated into the devices we carry around with us every day, we expect these new ways of searching to disrupt our traditional habits, moving us further away from the search bar.

The search landscape is a melting pot of innovation and change. You only need to look at the transformative efforts of search giant Google over the past 20 years to see how far we have come: from desktop to mobile, text to voice (and visual), and now reactive to proactive (and assistive) search. Announcements at the Google I/O 2019 event around AR search, Google Lens, Google Duplex and Google Assistant are a further demonstration of how the company is innovating for the future and what lies ahead for the search industry. When these developments in technology combine with changing consumer behaviour and a whole host of new places to search, a fast-moving but exciting period for the industry is inevitable.


Beyond Google: increasing diversity in the search landscape

With changing consumer behaviours, we will see search journeys evolve to include a wider variety of sites. People are now searching on platforms that are focused on one specific industry or type of content, and they will go directly to these – take Netflix, YouTube, the travel search engine Skyscanner or online retailers Amazon and eBay, for example. These types of sites are referred to as vertical search sites. We found that what people view as traditional Google search still plays a dominant role in the search landscape, with more than two-thirds of 18-to-34-year-olds surveyed admitting they couldn’t function without it. However, it was also interesting to see the change in how people structure their searches. Along with a growth in people visiting vertical search sites, we observed that traditional Google search is now happening at a wider variety of moments within the search journey – not just at the beginning of a search but also mid-way and at the completion of the final purchase. It is the sheer quantity, as well as the variety, of online content that people now have to navigate that is driving all of these changes in consumer behaviour.

People now know what they want and how to find it. In order to sift through all the information at their fingertips and arrive at the right content quickly, they are increasingly utilising the specialised nature of vertical search. Together, all of these behaviours mean that people are searching for more things than ever, in a wider variety of ways, and therefore we expect to see overall growth for the entire search market as a result.

Across a wide range of search topics, our research shows that a significant proportion of people say they would rather start their search journey on a specialist platform than on their search engine of choice, something that was especially the case within product-led categories (Fig.1). And it’s the brand-led searches that drive vertical behaviours: while just under two-thirds claim they would choose to look for groceries outside search engines, people still prefer them when it comes to finding recipes.


Historically, vertical search on websites has been characterised as occurring when people are close to purchase and are looking to finalise their choice. We asked our respondents to document the searches they made for their last online purchase and found that, across a range of different categories, vertical search is now playing much more of a role at earlier stages within the journey than was previously thought. While just under a fifth of people went to a search engine, around three-quarters reported starting their search at a specialist website or app. The dominance of Amazon in ecommerce has undeniably fed into this, as more than half of those we surveyed claimed they would go straight to Amazon when searching for a particular product. When it came to actual behaviours, we found this to be particularly the case for certain categories – more than half said that they started their search at Amazon when buying entertainment or children’s products (Fig. 2).


Through partnering with Captify (the global leader in search intelligence) we have been able to observe the growth in vertical searches across their network – premium publisher sites, vertical-specific sites, news and entertainment portals, price comparison sites and telco portals (Fig. 3).


While growth in the broader categories such as finance and travel remains largely stable, the most significant changes have occurred within more specific categories such as makeup, accessories and women’s clothing.

The past few years have also seen search within social platforms develop rapidly, orientated towards exploratory searches related to a topic or product category of interest. Pinterest is a good example of this, and the rise of visual search capabilities has only served to strengthen this discovery role.

But just as Amazon and other verticals are playing more of a role across the early stages of the search journey, we can expect social platforms to facilitate more of those later stage, transactional searches, particularly within highly visual categories such as fashion, beauty and homeware.

Instagram’s recent announcement of an in-app checkout for its shoppable posts typifies the way in which social will be able to fulfil “full-funnel” searches more effectively. These changing roles of the different players are likely to make the part that vertical search plays within the search journey more varied than ever before.

Assistive search

Search has always been about matching content with a person’s need or intent. In the early days of search the only expression of that intent came through the keyword itself. Over time, a greater understanding of people and their context has improved both the search platform and the advertiser’s ability to understand someone’s intent.

The continued development and enrichment of audience targeting over the past couple of years has meant that the relative importance of the keyword for search marketers has begun to decline. Using an analysis of intent signals such as recent search queries, location and website browsing activity, advertisers are becoming less reliant on the keyword itself to indicate what need someone is trying to resolve. We estimate that approximately half of pay-per-click activity includes some form of audience targeting – typically a strategy of using various audience types layered with positive and negative keyword bids.

In the coming years, the application of machine learning to a wider range of data signals will lead to a much more sophisticated understanding of people’s search needs. As well as using direct expressions of intent (i.e. keywords or voice commands), machine learning algorithms will use data relating to physical context, social graphs and past behaviour to predict intentions.

This will have two key effects:

  1. From the user perspective, we will move towards a world where the search engine becomes increasingly assistive and proactive, often meeting our needs before they are expressed as a keyword search.
  2. For the advertiser, there will be an increasing shift in focus from keywords to intent and audience buying.
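The shift from keywords to broader intent signals can be illustrated with a toy scoring model. The signal names, weights and values below are entirely hypothetical – in practice these relationships are learned by machine-learning models over far richer data:

```python
# Illustrative sketch of blending intent signals beyond the keyword.
# All names, weights and values are hypothetical.

def intent_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Combine weighted evidence of intent into a single score."""
    return sum(weights[name] * value for name, value in signals.items())

weights = {
    "keyword_match": 0.4,   # direct expression of intent
    "recent_queries": 0.3,  # related searches earlier in the session
    "location": 0.1,        # e.g. physical proximity to a relevant store
    "browsing": 0.2,        # visits to category pages
}

user_signals = {
    "keyword_match": 0.9,
    "recent_queries": 0.7,
    "location": 0.2,
    "browsing": 0.5,
}

print(round(intent_score(user_signals, weights), 2))  # → 0.69
```

Even in this toy form, the keyword contributes less than half of the final score – which is the direction of travel the section describes.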

The rise of sensory search: moving beyond the search bar

The increased diversity that we have seen developing in the search landscape isn’t just limited to the screen. Over the next few years we expect to see people’s search behaviours move beyond computer and laptop screens, becoming less text dependent and much more sensory.

Since the introduction of Siri in 2011, people have taken some time to adapt to voice as a search behaviour, owing to challenges such as recognition of different accents, the quality of results, discomfort talking to a machine and a general lack of understanding as to when voice might be more useful than text.

But in 2019, it seems that voice technology has really come of age, with people’s perceptions of voice shifting from a short-term, fun and gimmicky technology to a true understanding of the longer-term, time-saving benefits of having voice search capabilities built into daily life. In fact, almost half of 18-to-34-year-old voice users can’t imagine going back to the days before they had a smart speaker. Visual search, on the other hand, is a much more natural and familiar search behaviour. While search inputs haven’t traditionally been image-based, processing search output as images certainly has – from scrolling friends’ and influencers’ feeds on Instagram, to browsing weddings or home renovations on Pinterest, or even future love interests on Tinder.

As a result, when introduced to the concept of visual search, people can immediately recognise the benefits of the technology and potential use cases, both in terms of changes to the way we search now and in opening up new possibilities in the way we search in the future.

But the shift to sensory search is not as simple as a Find and Replace for current search behaviours. As we start to move away from purely text-based search, these new ways of searching will be used in quite different ways, based on their strength in meeting people’s needs (Fig. 4).


Voice is best suited to specific, focused queries with one objective answer. It’s the perfect platform for when a person has something very specific in mind, where the input and output are simple. While the use of voice search has mostly been limited to the household through smart speakers, we are increasingly seeing voice technology embedded into other products, and a rise in usage of voice search in other locations and moments in people’s lives, such as in the car and on the go, or through mobile phones and wireless headphones. Where voice works less well is when the query yields a complex or detailed answer. Explorational and conversational search methods are currently limited by natural language processing capabilities, as well as the lack of a screen (particularly in the case of smart speakers). When the platform fails to deliver on its inherent benefits of ease and simplicity, people are quick to become frustrated; the issues they’ve had with voice in the past have made them increasingly intolerant of the technology letting them down.

Where it does work, however, people are truly starting to appreciate the broader benefit of having access to voice search, particularly from the perspective of removing friction in their everyday lives.

While voice search moments are short and sharp, visual search is best suited for tasks that are focused on inspiration, exploration and discovery, in terms of both inputs and outputs: where either you are faced with a spontaneous search that’s hard to articulate, or the “correct” output is subjective, or both.

More than half of people surveyed agree that visual search is a good way to get inspiration before shopping or to discover new products and brands, and almost two-thirds believe visual search will make it easier to search online.

With this in mind, it’s likely that we will see voice and visual search becoming stronger and more useful to people at different stages of the consumer journey. While visual search is at the start of the journey as people seek inspiration and discover or explore new products and brands, voice will be much further along with specific product queries or details, and potentially even through to purchase.

It’s important, however, to ensure that we aren’t thinking about voice and visual search as siloed search channels. In reality, it’s likely that we will see voice and visual search used together with text in the future, with visual methods triggering searches, and voice then used to refine queries further down the line.

We expect to see significant growth in sensory search over the next three years as these new ways to search not only cannibalise existing text search moments, but also create new ones.


Figure 5 shows current perceptions of the search moments where people think voice and visual search would work well. Comparing these with our more tech-savvy 18-to-34-year-olds (Fig. 6), we can see that acceptance of voice to address popular search moments, at least in the short term, is a much larger opportunity for brands than visual search. While text will still do most of the heavy lifting in the foreseeable future, we believe voice will achieve greater scale than visual in the search space over the next three years, as it cannibalises common text search moments such as directions, specific questions, news and entertainment.


Because people are so quick to accept the concept of visual search and how it may fit into their lives, visual search has the potential to become a victim of its own success. Current visual use cases are less common, largely limited to the identification of objects and focused on sectors that already lend themselves well to image-based searches (e.g. fashion, homeware, beauty). However, the search moments that people think (and expect) visual search will address go far beyond its current capabilities. People envision visual search pulling together nutritional information and recipes for the food currently in their fridge, identifying issues and providing solutions for situations in front of them – a sick-looking plant, or a red wine stain on the carpet – or scanning their living room to suggest complementary home decor pieces.

Because images are computationally harder to process than words, and intent in visual search is much harder to determine, the technology simply cannot yet meet people’s expectations of what they want it to deliver. If visual search is pushed onto people before it is ready to deliver on its promises, there is a danger that this will halt uptake and, much like the introduction of Siri, delay adoption for far longer than if a higher-performing technology were introduced later.


Read more from Atticus Journal Volume 25
This is an excerpt from Future of search.

Sophie Harding, Josie Ung and Ksenia Kharkina


Published on 10 November 2020

