Digital transformation must be a core organizational competency. That's my key advice to CIOs and IT leaders.

During keynotes and discussions with CIOs, I remind everyone how strategic priorities evolve significantly every two years or less: from growth in 2018, to pandemic response and remote work in 2020, to hybrid work and financial constraints in 2022.

The impact of generative AI, including ChatGPT and other large language models (LLMs), will be a significant transformation driver heading into 2024.

As CIOs prepare their 2024 budgets and digital transformation priorities, it is imperative to develop a strategy that seeks opportunities to evolve business models, targets near-term operational impacts, prioritizes where employees should experiment, and defines AI risk-mitigation plans.

But with all the excitement and hype, it's easy for employees to invest time in AI tools that compromise confidential data, or for managers to select shadow AI tools that haven't been through security, data governance, and other vendor compliance reviews. The bigger challenge is to define a realistic strategy and develop a response to the "impossible dreamer," a business leader who asks for the moon and is one of the business execs you'll meet in hell.

Abhijit Mazumder, CIO of Tata Consultancy Services, says, "Transformation priorities should fundamentally link to business priorities and what any respective organization is trying to achieve. In most companies, the leadership concentrates equally on growth and operational efficiency while not forgetting to prioritize resiliency, cybersecurity, and technology debt elimination programs."

Below are several generative AI drivers for CIOs to consider when evolving their digital transformation priorities.

Define a game-changing LLM strategy

At a recent Coffee with Digital Trailblazers I hosted, we discussed how generative AI and LLMs will impact every industry.
Examples range from reinvented customer experiences to new products, services, and operating models.

"This is a time for CIOs and CTOs to not only be creative about how they do more with less but also leapfrog the competition with calculated investments, as their competitors are likely looking to delay or cut their own transformation projects," says Jeremiah Stone, CTO at SnapLogic. "Prioritize transformation initiatives that can either create new revenue streams, democratize technologies, or reduce technical debt, especially when considering generative AI opportunities."

CIOs will probably recognize that transformation programs of this magnitude are multiyear efforts that require evaluating LLM capabilities, experimenting, and finding minimally viable, adequately safe customer offerings. But not having a strategy may lead to disruption, and a key mistake IT leaders can make in board meetings is having no plan for a world-changing emerging technology like generative AI.

Clean and prep your data for private LLMs

Generative AI will increase the importance and value of an enterprise's unstructured data, including documents, videos, and content stored in learning management systems. Even if an enterprise isn't ready to consider how generative AI may disrupt its industry and business, proactive transformation leaders will take steps to centralize, cleanse, and prep unstructured data for use in LLMs.

"With users across the organization clamoring to leverage generative AI capabilities as part of their daily activities, priority No.
1 for CIOs, CTOs, and CDOs is to enable secure, scalable access to a growing range of generative AI models and enable data science teams to develop and operationalize fine-tuned LLMs tailored for the organization's data and use cases," says Kjell Carlsson, head of data science strategy and evangelism at Domino.

There are already at least 14 LLMs that aren't ChatGPT, and if you have large data sets, you can customize a proprietary LLM using platforms such as Databricks Dolly, Meta's Llama, and OpenAI, or build your own LLM from scratch.

Customizing and developing LLMs requires a strong business case, technical expertise, and funding. Peter Pezaris, chief design and strategy officer at New Relic, says, "Training large language models can be expensive, and the outputs haven't been perfected, so leaders should prioritize investing in solutions that help monitor usage costs and improve the quality of query results."

Seek efficiencies by improving customer support

McKinsey estimated back in 2020 that AI could deliver $1 trillion in value each year, with customer support a significant opportunity. The opportunity is greater today because of generative AI, especially when CIOs centralize unstructured data in an LLM and enable service agents to ask and answer customers' questions.

Justin Rodenbostel, EVP at SPR, says, "Search for opportunities to leverage GPT-4 and LLMs for optimizing activities like customer support, especially regarding automating tasks and analyzing large quantities of unstructured data."

Improving customer support is a quick win for delivering short-term ROI from LLMs and AI search capabilities. LLMs require centralizing an enterprise's unstructured data, including data embedded in CRMs, file systems, and other SaaS tools.
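The centralize-and-prep step can be sketched in a few lines of Python. This is a minimal illustration, not a production pipeline: the function names and the sample document are hypothetical, and a real implementation would feed the resulting chunks into an embedding index or vector store for the private LLM.

```python
import re

def clean_text(raw: str) -> str:
    """Normalize a raw document: drop control characters, collapse whitespace."""
    text = re.sub(r"[\x00-\x08\x0b-\x1f]", " ", raw)  # strip control chars
    text = re.sub(r"\s+", " ", text)                  # collapse runs of whitespace
    return text.strip()

def chunk_text(text: str, max_words: int = 200, overlap: int = 20) -> list[str]:
    """Split cleaned text into overlapping word windows for indexing."""
    words = text.split()
    chunks = []
    step = max_words - overlap
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + max_words])
        if chunk:
            chunks.append(chunk)
        if start + max_words >= len(words):
            break
    return chunks

# Example: prep one support article pulled from a CRM export (hypothetical content)
doc = "Refunds:\t customers may request a refund\x07 within 30 days of purchase."
cleaned = clean_text(doc)
chunks = chunk_text(cleaned, max_words=8, overlap=2)
```

The overlap between chunks is a common retrieval trick: it keeps a sentence that straddles a chunk boundary visible in at least one chunk, at the cost of a little index duplication.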
Once IT centralizes this data and implements a private LLM, other opportunities include improving sales lead conversion and HR onboarding processes.

"Companies have been stuffing data into SharePoint and other systems for decades," says Gordon Allott, president and CEO of GetK3. "It might actually be worth something by cleaning it up and using an LLM."

Mitigate risks by communicating an LLM governance model

The generative AI landscape has more than 100 tools covering text, image, video, code, speech, and other categories. What stops employees from trying a tool and pasting proprietary or other confidential information into their prompts?

Rodenbostel suggests, "Leaders must ensure their teams only use these tools in approved, appropriate ways by researching and creating an acceptable use policy."

CIOs must partner with their CHROs and CISOs to communicate policy and create a governance model that supports smart experimentation. First, CIOs should evaluate how ChatGPT and other generative AIs impact coding and software development. IT must lead by example on where and how to experiment, and when not to use a tool or a proprietary data set.

Marketing is the second area of focus: marketers can use ChatGPT and other generative AIs in content creation, lead generation, email marketing, and more than ten other common marketing practices. With more than 11,000 marketing technology solutions available today, there are plenty of opportunities to experiment, and to make inadvertent mistakes when testing SaaS tools with new LLM capabilities.

CIOs of leading organizations are creating a registry to onboard new generative AI use cases, define a process for reviewing methodologies, and centralize capturing the impact of AI experiments.
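A lightweight version of such a registry can be sketched in Python. The field names and workflow states below are illustrative assumptions, not a standard from the article; real registries usually live in a GRC tool or service catalog rather than code.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AIUseCase:
    """One entry in a generative AI use-case registry (illustrative fields)."""
    name: str
    owner: str
    data_classification: str  # e.g. "public", "internal", "confidential"
    status: str = "proposed"  # proposed -> approved / rejected
    submitted: date = field(default_factory=date.today)

class UseCaseRegistry:
    """Central registry: onboard use cases, record reviews, report what's pending."""

    def __init__(self) -> None:
        self._cases: dict[str, AIUseCase] = {}

    def onboard(self, case: AIUseCase) -> None:
        self._cases[case.name] = case

    def review(self, name: str, approved: bool) -> None:
        self._cases[name].status = "approved" if approved else "rejected"

    def pending(self) -> list[str]:
        return [c.name for c in self._cases.values() if c.status == "proposed"]

# Example: two teams submit use cases; governance reviews one of them
registry = UseCaseRegistry()
registry.onboard(AIUseCase("support-chat-summaries", "cx-team", "internal"))
registry.onboard(AIUseCase("marketing-copy-drafts", "marketing", "public"))
registry.review("marketing-copy-drafts", approved=True)
```

Even a sketch this small captures the governance essentials the article describes: every experiment has a named owner, a data classification, and an explicit review state.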
Re-evaluate decision-making processes and authorities

One important area to consider is how generative AI will impact decision-making processes and the future of work.

Over the past decade, many businesses have aimed to become data-driven organizations by democratizing access to data, training more businesspeople in citizen data science, and instilling proactive data governance practices. Generative AI unleashes new capabilities, enabling leaders to prompt for quick answers, but timeliness, accuracy, and bias remain key issues for many LLMs.

"Keeping humans at the center of AI and establishing robust frameworks for data usage and model interpretability will go a long way in mitigating bias within these models and ensuring all AI outputs are ethical and responsible," says Erik Voight, VP of enterprise solutions at Appen. "The reality is that AI models are no replacement for humans when it comes to critical decision-making and should be used to supplement these processes, not take them over entirely."

CIOs should seek a balanced approach to prioritizing generative AI initiatives, including defining governance, identifying short-term efficiencies, and seeking longer-term transformation opportunities.