Search engines are used very differently from how they were just ten years ago. Search products have diversified in features, use cases, audiences and the paths along which they have evolved. There is more data to sort through, and a greater expectation of reaching the most relevant information faster. User behaviour is gradually shifting away from manually sifting through several search results to find the information they need, towards getting answers up front along with references.
Evolution
Learning user intent
The latest buzz in search evolution is semantic search. Pandu Nayak, VP of Search at Google, aptly describes search as being fundamentally about understanding language: https://www.blog.google/products/search/search-language-understanding-bert/. The same post describes the technique Google uses to achieve this: BERT. The term semantic search captures a shared understanding in the search world today that engines need to understand user intent, going beyond the simplistic keyword matching that was the norm in the past.
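To make the contrast concrete, here is a minimal sketch of keyword matching versus embedding-based semantic matching. It assumes the sentence-transformers library and the all-MiniLM-L6-v2 model, neither of which is mentioned in Google's post; it only illustrates the general idea of matching on meaning rather than on shared words.

```python
# A minimal sketch: keyword overlap vs. embedding-based semantic similarity.
# Assumes the sentence-transformers library (pip install sentence-transformers);
# the model below is an illustrative choice, not anything Google uses in production.
from sentence_transformers import SentenceTransformer, util

query = "cheap flights to new york"
documents = [
    "low cost air tickets to NYC",          # same intent, almost no shared words
    "new cheap flats for rent in york",     # shares words, entirely different intent
]

# Keyword matching: count overlapping words. Here it favours the wrong document.
def keyword_overlap(a: str, b: str) -> int:
    return len(set(a.lower().split()) & set(b.lower().split()))

print([keyword_overlap(query, d) for d in documents])  # [1, 3]

# Semantic matching: compare dense sentence embeddings with cosine similarity,
# which is expected to rank the paraphrase above the keyword-heavy distractor.
model = SentenceTransformer("all-MiniLM-L6-v2")
query_emb = model.encode(query, convert_to_tensor=True)
doc_embs = model.encode(documents, convert_to_tensor=True)
print(util.cos_sim(query_emb, doc_embs))
```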
Knowledge graphs
A cursory glance at the SERPs (search engine result pages) of most search engines offers powerful evidence of this shift, with popular engines displaying information drawn from a knowledge graph on the right side of the screen. Knowledge graphs are another technique that big companies such as Google, Amazon and Microsoft have employed to serve user intent without significant user effort. In fact, most users would be unaware of ever having interacted with one. Alexa, for example, performs very well at answering trivia questions from around the world, with knowledge graphs playing the key role behind the scenes.
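The idea is easier to see with a toy example. The sketch below is a hypothetical, dictionary-backed triple store, not how Google's or Amazon's systems are built; it only shows how storing facts as (subject, predicate, object) triples lets an assistant answer a trivia question directly instead of returning a list of links.

```python
# A toy knowledge graph: facts stored as (subject, predicate, object) triples.
# Illustrative sketch only, nothing like a production-scale knowledge graph.
from collections import defaultdict

triples = [
    ("Mount Everest", "height", "about 8,849 m"),
    ("Mount Everest", "located_in", "the Nepal/China border region"),
    ("Nile", "length", "about 6,650 km"),
]

# Index facts by (subject, predicate) for direct lookup.
index = defaultdict(list)
for subj, pred, obj in triples:
    index[(subj, pred)].append(obj)

def answer(subject: str, predicate: str) -> str:
    """Return a direct answer instead of a list of documents to sift through."""
    objects = index.get((subject, predicate))
    return f"{subject} {predicate}: {', '.join(objects)}" if objects else "I don't know."

print(answer("Mount Everest", "height"))  # Mount Everest height: about 8,849 m
print(answer("Nile", "length"))           # Nile length: about 6,650 km
```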
Conversation
With the predominance of semantic search today, the shape and form of the product have changed for the user as well. We see increasing user interaction with conversational interfaces, including chatbots, for transactional search queries: checking the weather, looking up a specific data point, and generally any query that has a single highly relevant result. From a user experience perspective, the question-and-answer paradigm feels more natural and intuitive when implemented intelligently.
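As a rough illustration of that question-and-answer paradigm for transactional queries, here is a hypothetical rule-based sketch that maps a recognised intent to a single high-relevance answer instead of a results page. Real assistants use trained intent classifiers and live data sources; the intents and canned responses here are invented purely for illustration.

```python
# A hypothetical rule-based router for transactional queries: one intent, one answer.
# Real conversational systems use trained intent classifiers and live data sources.
import re

def route(query: str) -> str:
    q = query.lower()
    if re.search(r"\bweather\b", q):
        return "It's 24 degrees and sunny in your area."      # placeholder answer
    if re.search(r"\btime\b", q):
        return "It's 10:42 AM."                               # placeholder answer
    return "Here are some search results for your query..."   # fall back to classic search

print(route("What's the weather like today?"))
print(route("history of search engines"))
```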
Accessibility
Voice input, multilingual interfaces and data, screen readers and haptic feedback, while useful innovations, still fall short of making tech, including search, as accessible and inclusive as we'd like. But the problems go beyond technology. We also need to evolve local business models and create ecosystems around them that give both enterprises and users an incentive to adopt these products locally and diversely. More on this in a future post.
Future
While we have seen breathtaking advancements in search tech, we're still far away from an AI with near-human semantic proficiency. One reason for this is the massive barrier to entry in this field, which limits independent innovation. Most of the tech mentioned here has been developed by big shops that can afford to run TPU and GPU clusters en masse over large amounts of data without breaking a sweat. The "BERT large" model, for example, learns around 340 million parameters and would take about 4 days to train on 16 TPUs if not optimised.
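A rough back-of-the-envelope count shows where those 340 million parameters come from. The sketch below plugs in the published BERT-large hyperparameters (24 layers, hidden size 1024, feed-forward size 4096, a roughly 30K WordPiece vocabulary, 512 positions); it is an approximation, not an exact accounting of the released checkpoint.

```python
# Approximate parameter count for BERT-large from its published hyperparameters.
# Back-of-the-envelope only; small terms (biases, layer norms) are counted roughly.
L, H, FF, V, P, SEG = 24, 1024, 4096, 30522, 512, 2

embeddings = V * H + P * H + SEG * H + 2 * H      # token, position, segment + layer norm
attention_per_layer = 4 * (H * H + H)             # Q, K, V and output projections
ffn_per_layer = H * FF + FF + FF * H + H          # two dense layers with biases
layernorms_per_layer = 2 * 2 * H
per_layer = attention_per_layer + ffn_per_layer + layernorms_per_layer
pooler = H * H + H

total = embeddings + L * per_layer + pooler
print(f"{total / 1e6:.0f}M parameters")           # roughly 335M, i.e. ~340 million
```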
Google's open-sourcing of BERT, along with its release of pre-trained models, is an example of efforts by large companies to drive independent innovation in applying the underlying science. There are also efforts to be more prudent when building machine learning models: to make them more efficient and tailor them to specific use cases as far as possible.
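Picking up one of those pre-trained checkpoints is only a few lines of code today. The sketch below uses the Hugging Face transformers library, which is one common way to load a released BERT model; it is an assumption for illustration, not the only distribution channel (Google's original release was a set of TensorFlow checkpoints).

```python
# Loading a pre-trained BERT model and encoding a sentence.
# Uses the Hugging Face transformers library (pip install transformers torch);
# shown here as one common route to the released checkpoints, for illustration.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Semantic search is about understanding intent.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size 768)
```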
There are several areas where both businesses and customers need, and expect, semantic search to get them to the most relevant information faster, ideally in a question-and-answer format. Use cases include e-commerce, academic research and mining organisational data to aid decision making. We can expect applications in these areas to proliferate in the coming years, along with the exploration of new and hybrid use cases where the journey begins at search but doesn't end there.