LLM Boom: Winning in an era of AI-fueled innovation

Laura Balcazar

More than half of Y Combinator’s AI portfolio companies have launched in the past two years, following ChatGPT’s November 2022 debut¹. The excitement is palpable, with Google mentioning ‘AI’ more than 120 times in 120 minutes during its 2024 Google I/O keynote.

But how much can LLMs really do? And how should you incorporate them into your processes – if at all? Do you build or buy? And how do you choose the right partner – especially when there seem to be so many options? 

In simple terms, Large Language Models (LLMs) are great at anything to do with language: reading, writing, summarizing, translating. Nowadays, they can also handle multimodal requests, meaning they can take in and produce images and sound – not just text. In the logistics world, this capability allows them to automate complex tasks such as reading documents, entering data, and providing customer support.
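As a rough illustration, the sketch below shows the document-reading piece in miniature: prompting a model to pull a handful of structured fields out of an AP invoice. The field names, prompt wording, and `call_llm` function are placeholders rather than any particular vendor’s API – a real system would plug in its own provider and schemas.

```python
import json

# Fields we want the model to extract from an AP invoice.
# These names are illustrative, not a standard schema.
INVOICE_FIELDS = ["invoice_number", "carrier_name", "total_amount", "currency", "due_date"]

def build_extraction_prompt(document_text: str) -> str:
    """Ask the model to return the requested fields as strict JSON."""
    return (
        "Extract the following fields from the invoice below and "
        f"return only a JSON object with these keys: {', '.join(INVOICE_FIELDS)}.\n"
        "Use null for any field you cannot find.\n\n"
        f"Invoice:\n{document_text}"
    )

def extract_invoice_fields(document_text: str, call_llm) -> dict:
    """`call_llm` is any function that sends a prompt to an LLM and
    returns its text response (GPT, Gemini, Llama, etc.)."""
    raw = call_llm(build_extraction_prompt(document_text))
    return json.loads(raw)  # production code would validate and retry on malformed JSON

# Toy usage with a stubbed model response, so the sketch runs on its own:
if __name__ == "__main__":
    fake_llm = lambda prompt: (
        '{"invoice_number": "INV-1042", "carrier_name": "Acme Freight", '
        '"total_amount": "1250.00", "currency": "USD", "due_date": "2024-07-15"}'
    )
    print(extract_invoice_fields("...scanned invoice text...", fake_llm))
```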

The interesting thing is that while LLMs are a cutting-edge technology, they are also something of a commodity. Leveraging their power is key to remaining competitive, yet the complexity lies not in accessing LLMs but in wielding them effectively. Winning in this new AI era requires industry and technological expertise, speedy testing and implementation, and effectively embedding AI into existing tools and workflows.

Where to start? 

Bloomberg predicts that generative AI will become a $1.3 trillion market by 2032². Given the massive potential of these technologies, it’s no surprise that countless companies have sprouted or rebranded as AI-enabled, AI-powered, or harnessing AI at their core. While this surge may cause marketing fatigue in business leaders looking to make effective business decisions, it’s clear that LLMs have become a transformative yet accessible commodity:

  • They are widely available. OpenAI’s GPT, Google’s BERT, and Meta’s Llama are available globally to anyone. 
  • They are interchangeable. LLMs can be swapped in and out depending on which is best suited to a given task and on the latest releases (see the sketch after this list).
  • They are getting cheaper. While LLMs can be enormously expensive to develop, with Google’s Gemini Ultra costing upwards of $191M to train³, leveraging pre-trained LLMs can be a cost-effective option for businesses.
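To picture what interchangeability looks like in practice, one common pattern is to keep models behind a thin routing layer so that swapping providers is a configuration change rather than a rewrite. The sketch below is a minimal, hypothetical version: the task names and backends are stand-ins, not real SDK calls.

```python
from typing import Callable, Dict

# Any function that takes a prompt and returns the model's text response.
ModelFn = Callable[[str], str]

def gpt_backend(prompt: str) -> str:      # stub for a hosted API call
    return f"[gpt] {prompt[:40]}..."

def llama_backend(prompt: str) -> str:    # stub for a self-hosted open model
    return f"[llama] {prompt[:40]}..."

# Map each task to whichever model currently performs best for it.
MODEL_ROUTER: Dict[str, ModelFn] = {
    "document_extraction": gpt_backend,
    "email_drafting": llama_backend,
}

def run_task(task: str, prompt: str) -> str:
    """Swapping the model behind a task is a one-line config change, not a rewrite."""
    return MODEL_ROUTER[task](prompt)

print(run_task("document_extraction", "Extract the carrier name from this bill of lading..."))
```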

If LLMs are so powerful and available, what’s the catch? Maximizing AI’s potential is quite tricky, requiring expertise, experimentation, and resources. For logistics companies, striking gold during the LLM boom requires excelling in three areas:

  1. Up-training and fine-tuning foundation models with robust data and machine learning (ML) expertise.
    • A freight forwarder can upload and process an AP invoice with ChatGPT, but try doing that 3,000 times, for dozens of different partners, with thousands of document layouts, while updating every downstream system in less than a minute. That's where having the right partner is key.
    • Models thrive on data, so the companies with access to vast, proprietary datasets will deliver the best results, homing in on industry-specific use cases.
    • Teams must possess the ML expertise to fine-tune models, experimenting with different architectures on top of foundation models for optimal, task-specific outcomes. 
  2. Testing and implementing at lightning speed.
    • Having the best models is insufficient. In this rapidly evolving environment, companies must test new models across various use cases, measure regressions, and roll out where appropriate. Configurability and speed are crucial (a rough sketch of such a regression check follows this list).
    • For this, companies need the right internal tooling in place, which is rare at a stage when most providers are focused on capturing the attention and business of their first customers rather than on optimization and scale. The latter is what drives real value.
  3. Connecting AI power to full stack ecosystems.
    • Finally, AI must operate within the context of an industry’s systems and workflows, offering enough value to justify workflow changes. Workflows that are not fully agentic yield the best results only when users are highly engaged and satisfied with the tool. Even agentic workflows provide the most value when hooked into relevant sources of data like CRMs and ERPs.
    • For this, full-stack features and integrations are key. It is not enough to extract data from a PDF. At a minimum, a service should also standardize that data into the right format, validate the information, provide an approval step for workflows that require a human in the loop, and send the data to the TMS – maybe even updating customers and generating dashboards – all while providing a delightful user experience and a clear interface. A simplified version of this flow is sketched below.
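As a concrete (if highly simplified) picture of that full-stack flow, the sketch below standardizes extracted values, validates them, routes questionable records to human review, and pushes clean ones to the TMS. Every function name, field, and currency list here is an assumption for illustration, not a description of any specific product.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Shipment:
    reference: str
    amount: float
    currency: str
    ship_date: datetime

def standardize(raw: dict) -> Shipment:
    """Coerce extracted strings into typed, normalized values."""
    return Shipment(
        reference=raw["reference"].strip().upper(),
        amount=float(raw["amount"]),
        currency=raw["currency"].upper(),
        ship_date=datetime.strptime(raw["ship_date"], "%Y-%m-%d"),
    )

def validate(s: Shipment) -> list:
    """Return a list of issues; an empty list means the record looks clean."""
    issues = []
    if s.amount <= 0:
        issues.append("amount must be positive")
    if s.currency not in {"USD", "EUR", "MXN"}:
        issues.append(f"unexpected currency {s.currency}")
    return issues

def process(raw: dict, push_to_tms, request_review) -> None:
    """push_to_tms and request_review are placeholders for real integrations."""
    shipment = standardize(raw)
    issues = validate(shipment)
    if issues:
        request_review(shipment, issues)   # human-in-the-loop approval step
    else:
        push_to_tms(shipment)              # straight-through processing

process(
    {"reference": "bk-889", "amount": "1840.50", "currency": "usd", "ship_date": "2024-06-01"},
    push_to_tms=lambda s: print("sent to TMS:", s),
    request_review=lambda s, i: print("needs review:", i),
)
```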
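On the testing point above, one lightweight way to vet a new model before rolling it out is to replay a fixed set of labeled examples through both the current and the candidate model and count regressions. The harness below is a bare-bones sketch with stubbed models and made-up test cases.

```python
# Minimal regression check: replay labeled examples through the current and
# candidate models and flag cases the candidate gets wrong that the current
# model got right. Both model arguments are any prompt -> answer callables.
def count_regressions(test_cases, current_model, candidate_model):
    regressions, improvements = 0, 0
    for prompt, expected in test_cases:
        old_ok = current_model(prompt) == expected
        new_ok = candidate_model(prompt) == expected
        if old_ok and not new_ok:
            regressions += 1
        elif new_ok and not old_ok:
            improvements += 1
    return regressions, improvements

# Toy usage with stubbed models and two labeled examples:
cases = [("What is the carrier on BOL 123?", "Acme Freight"),
         ("What is the PO number on invoice INV-9?", "PO-4411")]
current = lambda p: "Acme Freight" if "BOL" in p else "PO-4411"
candidate = lambda p: "Acme Freight"  # gets the second case wrong
print(count_regressions(cases, current, candidate))  # -> (1, 0)
```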

Sifting out fool’s gold is certainly a challenge in these early days of proliferating AI companies, especially when there is the temptation to fall back on low-cost workforces instead, or to do nothing at all.

The fact remains, however: AI is ushering in a new era of innovation and productivity similar to the internet revolution of the early 2000s, and companies that do not start integrating AI into their processes risk getting left behind. Winning providers are building holistic, industry-specific, end-to-end solutions embedded with AI. They are not just accessing but skillfully wielding AI’s power, keeping up with accelerating levels of innovation and change so you don’t have to.

1. Companies tagged with “AI” were considered “launched” based on their YC batch

2. Bloomberg

3. The Cost of Training AI

$1M annual savings & 2,000 extra hours a month await.

Explore how, on average, automating workflows for 3,000 shipments a month can lead to impressive annual savings. 
It all starts with a demo.
