News
04/26/2023

3 Uses of Large Language Models in the Travel Industry

Julian Whiting

The travel industry has been one of the most dynamic sectors of the global economy, and with the rise of digital technology, the industry is once again evolving. Today, travelers expect personalized experiences that cater to their individual preferences and needs, without paying for expensive concierge services.

Large Language Models (LLMs) are artificial intelligence-powered systems that can analyze vast amounts of data to provide personalized recommendations and insights automatically at low costs (relative to a human concierge or travel agent, for example). In the travel industry, these models are being used to enhance the travel experience for travelers and streamline operations for travel companies. Many companies in the industry are starting to utilize LLMs to create new product offerings and reduce operating costs.

What are Large Language Models?

Large Language Models are a type of artificial intelligence trained on vast amounts of text data; when prompted with a specific question or task, they generate a human-like response. These models use machine learning algorithms to learn the patterns and relationships in the data and use them to generate new content.

In the context of natural language processing (NLP), LLMs are trained on massive datasets of human language to understand the underlying structure and meaning of language. They can then use this understanding to generate text that is grammatically correct and semantically coherent, and in some cases, can mimic human-like responses.

One of the most well-known examples of a large language model is GPT-4 (aka Generative Pre-trained Transformer 4), developed by OpenAI. GPT-4 has been trained on a massive dataset of human language, can accept both text and image inputs, and can generate a wide range of text outputs, including computer code. It has been used in a variety of applications, from language translation to chatbots and virtual assistants. Other examples of large language models include BERT, GPT-J, and StableLM.

Uses for LLMs in Travel, Tourism & Hospitality

Personalized Customer Service

One of the most significant advantages of LLMs is their ability to personalize customer service. Travel companies and hotels can use LLMs to analyze customer data and create tailored travel itineraries and recommendations based on individual preferences. By automating customer support, LLMs can also provide personalized responses to inquiries and requests, further enhancing the customer experience.

Large language models are particularly well suited for customer support due to their ability to capture fine-grained contextual cues across many languages. Although every model has an inherent limit on how much text fits into its context window, intelligently designed chatbots powered by the newest generation of LLMs can combine short-term conversational memory with long-term memory retrieved from high-performance vector stores, allowing them to work around this limitation.
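To make this concrete, here is a minimal sketch (not a production design) of how such a chatbot might combine a short rolling conversation window with long-term memories retrieved by similarity search. The hashing-based `embed` function and the in-memory `LongTermMemory` class are stand-ins for a real embedding model and vector database.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Placeholder embedding: hash word tokens into a fixed-size vector.
    In practice, call a real embedding model (hosted API or open-source encoder)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

class LongTermMemory:
    """Tiny in-memory stand-in for a vector store."""
    def __init__(self):
        self.texts, self.vectors = [], []

    def add(self, text: str):
        self.texts.append(text)
        self.vectors.append(embed(text))

    def search(self, query: str, k: int = 3):
        q = embed(query)
        scores = [float(q @ v) for v in self.vectors]
        top = np.argsort(scores)[::-1][:k]
        return [self.texts[i] for i in top]

def build_prompt(memory: LongTermMemory, recent_turns: list[str], user_message: str) -> str:
    """Combine retrieved long-term memories with the short-term conversation window."""
    recalled = memory.search(user_message)
    return (
        "Relevant guest history:\n" + "\n".join(recalled) + "\n\n"
        "Recent conversation:\n" + "\n".join(recent_turns) + "\n\n"
        f"Guest: {user_message}\nAssistant:"
    )

# Older preferences live in the store; only the last few turns stay in the prompt.
memory = LongTermMemory()
memory.add("Guest prefers ocean-view rooms on high floors.")
memory.add("Guest asked about vegan breakfast options during their last stay.")
print(build_prompt(memory, ["Guest: Hi, I'd like to book again."],
                   "Any room suggestions for my trip in June?"))
```

In a real deployment, the assembled prompt would be sent to the LLM, keeping the total text well under the model's context limit even as the guest's history grows.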

Targeted Advertising

LLMs can also be used for marketing and advertising efforts. By analyzing social media data and customer behavior, LLMs can provide insights into customer preferences and help companies create targeted advertising campaigns. This can improve marketing strategies and ultimately lead to increased sales and customer engagement.

Text embeddings produced by LLMs are numerical representations of text that encode fine-grained contextual information. By embedding a user's search queries, product feedback, interests, and preferences, that user's profile can be matched against embeddings of candidate ads. For example, a user who is interested in fitness may be matched with a hotel that has an on-site gym or swimming pool.
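As a rough illustration, the sketch below ranks candidate ads against a user profile by embedding both and comparing them with a dot product. The hashing-based `embed` function and the example ads are placeholders; a production system would use a real LLM embedding model.

```python
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    # Placeholder hashing embedding; swap in a real LLM embedding model in practice.
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Hypothetical user profile built from queries, feedback, and stated interests.
user_profile = "daily workouts, lap swimming, healthy breakfast, beach destinations"

ads = {
    "city_hotel": "Downtown hotel near theaters and nightlife",
    "fitness_resort": "Beachfront resort with on-site gym and swimming pool",
    "ski_lodge": "Mountain lodge with ski-in ski-out access",
}

user_vec = embed(user_profile)
ranked = sorted(ads.items(), key=lambda kv: float(user_vec @ embed(kv[1])), reverse=True)
print("Best-matching ad:", ranked[0][0])
```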

Increased Efficiencies

Lastly, the travel, tourism, and hospitality industry can benefit from the use of LLMs to increase operational efficiency. Employee turnover is a significant issue in this industry, with a turnover rate of 4-5% per month in 2023, which is double the average rate in the private sector. High turnover can result in increased training costs, decreased productivity, and lower customer satisfaction.

LLMs can be utilized to develop or enhance training tools, providing an interface for constructing training materials from corporate documents, resulting in faster onboarding and improved knowledge retention. Even if those materials are not found in the dataset the LLM was trained on, the model can be prompted to search an existing database for relevant documents matching the user's query, effectively expanding a language model's knowledge base well beyond its training corpus.
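A simplified sketch of this retrieve-then-prompt pattern is shown below. The documents, the toy keyword retriever, and the `call_llm` stub are all placeholders; in practice the retriever would query a vector store and `call_llm` would hit a hosted or self-hosted model.

```python
# Minimal sketch of grounding an LLM on internal documents it was never trained on.

DOCS = {
    "housekeeping_sop.md": "Rooms must be serviced by 2 PM and maintenance issues logged in the ops app.",
    "front_desk_onboarding.md": "New front-desk staff shadow a senior agent for their first three shifts.",
    "loyalty_program.md": "Gold members receive late checkout until 2 PM when available.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Toy keyword-overlap retriever; a real system would run a vector-store search."""
    words = set(query.lower().split())
    ranked = sorted(DOCS.items(), key=lambda kv: -len(words & set(kv[1].lower().split())))
    return [f"{name}: {text}" for name, text in ranked[:k]]

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call (hosted API or self-hosted open-source model).
    return "[model response would appear here]"

def answer(question: str) -> str:
    """Build a prompt that restricts the model to the retrieved documents."""
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer the trainee's question using only the documents below.\n"
        f"Documents:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return call_llm(prompt)

print(answer("When do gold members get late checkout?"))
```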

Future developments involving ‘LLM Agents’ are anticipated to further improve accuracy by learning to harness the power of existing tools and APIs.

Considerations for Developing Your LLM for Travel and Tourism

When developing an LLM for the travel, tourism, and hospitality industry, there are several considerations to keep in mind. These include the size and quality of your training data, the complexity of the language model, and the infrastructure required to run the model.

Open Source vs. API

There are two primary approaches to developing an LLM-powered solution: building in-house with open-source models or using private models through an API. Open-source LLMs, such as GPT-J, are available for anyone to use and can be fine-tuned to specific use cases. APIs, such as those offered by OpenAI, expose pre-built LLMs that can be integrated into existing applications quickly.

Using a hosted API has the advantage of minimizing any overhead and maintenance costs associated with building your own infrastructure; however, for data governance and privacy concerns, it may be preferable to host your own model. If choosing the latter, the good news is that it’s becoming easier every day to host your own model as models are continuously optimized for lighter resource usage and serverless infrastructures are becoming easier to build and maintain.
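For teams going the self-hosted route, a minimal starting point might look like the following, which loads the open-source GPT-J model with the Hugging Face transformers library. It assumes a GPU with sufficient memory and omits quantization, batching, and serving concerns.

```python
# Minimal sketch of self-hosting an open-source LLM (GPT-J) with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

prompt = "Suggest three family-friendly activities near our beachfront resort:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=80, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```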

Build vs. Buy

Companies must consider the costs and benefits of developing their own LLM solution versus purchasing an existing one. Using a pre-built service like OpenAI's is a good option for quickly setting up a proof of concept without needing specialized machine learning personnel. However, this approach can be costly for high-throughput applications: the newest GPT-4 API charges up to $0.06 per 1,000 prompt tokens and $0.12 per 1,000 generated tokens. Developers therefore need to be mindful of the volume and frequency of data processed through these APIs.
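To see how this adds up, here is a back-of-the-envelope estimate at those rates for a hypothetical traffic level; the request volume and token counts below are illustrative only.

```python
# Cost estimate at the GPT-4 rates quoted above
# ($0.06 per 1k prompt tokens, $0.12 per 1k generated tokens).
requests_per_day = 10_000                      # hypothetical traffic
prompt_tokens, completion_tokens = 500, 200    # hypothetical tokens per request

daily_cost = requests_per_day * (
    prompt_tokens / 1000 * 0.06 + completion_tokens / 1000 * 0.12
)
print(f"~${daily_cost:,.0f} per day, ~${daily_cost * 30:,.0f} per month")
# -> ~$540 per day, ~$16,200 per month
```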

While models like GPT-4 perform well across a wide range of tasks, much of their knowledge may be unused on closed problems such as question answering over well-defined corporate documents. For simple search and product recommendations, open-source embedding models may provide comparable performance at a much lower cost. Smaller, open-source models offer benefits such as lower cost, increased product visibility, and better security when hosted internally.

Purchasing an existing solution, on the other hand, can be quicker to implement but may be less flexible and less secure. Prebuilt APIs are a great choice for building a quick proof of concept, as they can be used effectively by a handful of developers who don't necessarily need to be experts in natural language processing.

Work with Us

At Strong, we have the expertise to help your business leverage large language models for a variety of applications. Whether you have an abundance of data or none at all, an established product or a proof-of-concept, we can help you leverage state of the art NLP in a matter of weeks, not months. Contact us to learn more about how we can help.

