EQengineered POV: How Are You Approaching LLMs and AI as a Business? by Mark Hewitt
During my time at Forrester Research, I learned to embrace new technologies and trends through the lens of "What value can this provide to customers, my team, and myself?" Chasing the next shiny ball is not a strategy, nor is waiting until the market leaders set the pace and direction.
Results come from rolling up one's sleeves to become proficient with, and eventually master, relevant new skills and tools; from employing a disciplined approach to experimentation; and from identifying use cases that deliver evidence-based value. These tenets hold especially true for individuals, leaders, and organizations in today's data-fueled marketplace and environment.
I believe that we are at the peak of the generative AI hype curve, and I would argue that we are still in the early stages of effective enterprise AI/ML/data engineering adoption. Why do I say this?
Generative AI's value - and by that I am referring to large language models (LLMs) - is fairly simple to understand, the tools are accessible and usable, and the barrier to experimentation is low and inexpensive. For these reasons, it has gone mainstream, and individuals and businesses should be exploring the tools that can be leveraged, defining value-based use cases, experimenting to test the veracity of results, and accelerating learning so that enterprise knowledge and understanding increase.
AI/ML is complex, and data scientist and data engineer skill sets are still maturing. Individuals who can truly understand the business needs, liaise and communicate with both the business and the data engineering team to craft direction and solutions, and deliver the needed data engineering results remain scarce.
In addition to resource shortages, enterprise organizations need a modern data architecture and an effective data and analytics operating model to evolve their data logistics and data engineering toward a more advanced predictive and prescriptive direction. These fundamental building blocks are lacking in many enterprises.
Lastly, there remains confusion within enterprise organizations about where the data engineering leader should reside. Is it a direct CEO report, or a role reporting to operations/COO, the CFO, the CMO, or even engineering? Much attention has been paid to the organizational seat a CDO/CDAO or data engineering leader holds.
Recommendations | Takeaways
In my opinion, the Chief Data Officer or Chief Data and Analytics Officer must report directly to the CEO, given the criticality of data to the enterprise moving forward.
As an enterprise organization, it is critical to think through the intellectual property implications of leveraging public tools and the value of implementing a private or custom LLM solution. When the team's skills are sufficiently mature and adequate data is available, a private LLM that is correctly deployed and trained will deliver security, ownership of results, and precise, tailored outcomes specific to the organization, its domain, its style, and its language. And, if scaled correctly, a private LLM will yield increasingly accurate and valuable results.
There are risks to the custom LLM direction, including the required time and resource investment and the likelihood that the solution becomes less versatile outside of the specific domain it was trained on. Governance and an acute focus on the problem to be solved, along with data availability, readiness, and cleanliness, are paramount for an organization interested in building on a foundation model. Through a disciplined, transparent, consistent, and regularly audited and validated approach to data and models, an organization can avoid falling victim to model bias, risk, and degrading accuracy of results. This is true for both custom LLMs and the broader enterprise AI/ML direction.
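To make the "building on a foundation model" path a bit more concrete, the sketch below shows one common pattern: parameter-efficient (LoRA) fine-tuning of an open-weight model on a governed, organization-specific corpus, so both the weights and the data stay inside the enterprise. It is illustrative only and assumes the Hugging Face transformers, peft, and datasets libraries; the model name and data path are hypothetical placeholders, not a recommendation of a specific model.

```python
# Minimal sketch: parameter-efficient fine-tuning (LoRA) of an open-weight
# foundation model on private, organization-specific text.
# Assumes the Hugging Face transformers, peft, and datasets libraries;
# the model name and data path below are hypothetical placeholders.
from datasets import load_dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

BASE_MODEL = "your-org/approved-open-weight-model"   # placeholder
PRIVATE_DATA = "data/curated_domain_corpus.jsonl"    # placeholder: governed, cleaned corpus

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Train only a small set of low-rank adapter weights, not the full model.
lora_config = LoraConfig(task_type=TaskType.CAUSAL_LM,
                         r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(model, lora_config)

# Load the private corpus (one "text" field per record) and tokenize it.
dataset = load_dataset("json", data_files=PRIVATE_DATA, split="train")
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="private-llm-adapter",
                           per_device_train_batch_size=4,
                           num_train_epochs=1,
                           learning_rate=2e-4,
                           logging_steps=50),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("private-llm-adapter")  # adapter weights remain inside the organization
```

The governance points above still apply around a sketch like this: the corpus must be curated and consented, and the resulting model regularly audited and validated before and after deployment.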
For individuals and businesses alike, personalization through LLMs and AI will become increasingly prevalent in enhancing the customer experience. For example, marketers are using data such as browsing history, social media activity, and previous interactions with the enterprise to build AI algorithms that generate recommendations tailored specifically to the customer's interests and needs and provide a seamless, efficient experience. This type of use case leads to greater customer satisfaction, stronger brand loyalty, and increased revenue for the company. Find the use cases that deliver the most ROI for your business and get started.
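As a simple illustration of the mechanics behind such a use case, the sketch below builds a content-based recommendation from a customer's recent browsing history using TF-IDF similarity. It assumes scikit-learn, and the product catalog and browsing-history strings are invented examples; a production system would draw on the richer, consented data sources described above.

```python
# Minimal sketch: content-based product recommendations from a customer's
# recent browsing history. Illustrative only; the catalog and history are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

products = {
    "running-shoes": "lightweight running shoes cushioned trail road",
    "yoga-mat": "non-slip yoga mat exercise studio home workout",
    "trail-pack": "hydration trail pack running hiking lightweight",
    "office-chair": "ergonomic office chair lumbar support desk",
}

# Vectorize the product descriptions once.
vectorizer = TfidfVectorizer()
product_matrix = vectorizer.fit_transform(products.values())
product_ids = list(products.keys())

def recommend(browsing_history, top_n=2):
    """Rank products by similarity to what the customer recently viewed."""
    profile = vectorizer.transform([" ".join(browsing_history)])
    scores = cosine_similarity(profile, product_matrix).ravel()
    ranked = sorted(zip(product_ids, scores), key=lambda pair: pair[1], reverse=True)
    return [pid for pid, _ in ranked[:top_n]]

# Example: a customer who has been browsing running gear.
print(recommend(["lightweight trail running shoes", "hydration pack for runs"]))
```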
Support response time is another common target for AI solutions. AI and LLMs can improve customer service by providing timely, relevant, and personalized support. Again, inherent in these solutions must be a focus on governance to ensure the personalization respects data privacy and ethical use based on consent.
AI and generative AI solutions have great potential to revolutionize the way individuals and businesses communicate, interact, and address personal, customer, and experience needs. It is critical to get started by coming up to speed on mainstream tools, approaches, tactics, and techniques. Generative AI has catalyzed the shift toward a data-driven personal and professional world. Free, public courses are widely available. The time for inaction and a wait-and-see mindset has passed, as the pace of the data revolution changed considerably in 2023. The pace of innovation and progress will continue to accelerate in 2024 as we witness more breakthroughs and broader adoption.
If your organization needs help getting started with educating your team, discovering or assessing potential problems and impact, planning and creating roadmaps, or implementing data engineering solutions, EQengineered is here to help as a strategic consulting partner on your enterprise journey.