This year's Microsoft Ignite developer conference might as well be called AIgnite, with over half of the almost 600 sessions featuring artificial intelligence in some shape or form.

Generative AI, in particular, is at the heart of many of the new product announcements Microsoft is making at the event, including new AI capabilities for wrangling large language models (LLMs) in Azure, new additions to its Copilot range of generative AI assistants, new hardware, and a new tool to help developers deploy small language models (SLMs) too.

Here's some of the top AI news CIOs will want to take away from Microsoft Ignite 2023.

1. Bing Chat Enterprise goes away

When OpenAI released ChatGPT Enterprise in September, there was speculation that it could cause trouble for Microsoft's Bing Chat Enterprise, launched just two months prior. Sure enough, Bing Chat Enterprise will soon disappear, but only the name is going away: The product lives on and will be known simply as Copilot.

With the name change will come new capabilities, including, for organizations using Microsoft's Entra cloud-based identity management service, the ability to protect commercial data used within the chatbot. The new Copilot will be generally available from Dec. 1, 2023.

2. Copilots a go-go

Of course, Microsoft product naming could never be so simple, and there won't simply be one Copilot.
There's also Copilot in Dynamics 365, Copilot for Microsoft 365, Copilot in GitHub, Copilot in Viva, and now: Copilot for Service and Copilot for Sales.

Copilot for Service is intended to help agents in contact centers, ingesting customer information and knowledge-base articles and integrating with Teams, Outlook, and third-party systems, including Salesforce, ServiceNow, and Zendesk.

Confusingly, Microsoft already offers a Sales Copilot; Copilot for Sales is a different product that includes a license for Copilot for Microsoft 365, and helps sales staff prepare for customer meetings by creating custom briefing documents.

3. Copilots for sysadmins

It's not just Microsoft 365 users who get a copilot: Admins will have one too. A forthcoming update will add Copilot to the Edge for Business management interface, helping admins with recommended policies and extensions for the workplace browser. Other Microsoft 365 apps are already covered, including SharePoint and Teams. There's also a new adoption dashboard for Microsoft Viva to help track how the introduction of Copilot features in Microsoft 365 applications is changing the way users work.

4. Additions to Copilot for Microsoft 365

About those new features: Copilot is already starting to enrich Microsoft 365 apps, but a wave of new features arrives next year, so CIOs will need to be ready with answers to users' questions.

Starting next year, Teams, for example, will be able to take live meeting transcripts, summarize them as notes, and organize those notes on a whiteboard, suggesting more ideas to add as the meeting progresses. Meeting notes will also become interactive documents, enabling participants to ask for more detailed information on a particular point after the meeting has ended.
Organizations concerned about the risks of maintaining such written records will be able to turn the feature off by default or per meeting.

The additions to Copilot for Microsoft 365 won't stop there, though, as Microsoft is opening it up to plugins and connectors from third-party vendors, enabling it to source and cite data from Jira, Trello, Confluence, Freshworks, and others.

5. Copilot Studio: A copilot for creating copilots

Inevitably, just as OpenAI has made it possible to customize chatbots (GPTs) by means of a ChatGPT-like interface, Microsoft has created Copilot Studio, a conversational copilot for creating and customizing more copilots. It will provide access to Azure features, including speech recognition and sentiment analysis, and the ability to add more sophisticated features through Power Platform connectors and Power Automate workflows, all with governance features so that IT remains in control.

6. People don't want to give up their copilots

Each year, Microsoft publishes its Work Trend Index, which this time included a survey and observational studies of early Copilot users.

Those surveyed were clearly wowed by the tool: 70% said they were more productive with it, 77% said they didn't want to give it up, and 22% said the tool saved them more than 30 minutes a day.

"These time savings are pretty extraordinary, but it will be key for everyone to invest their refunded time wisely," said Frank X.
Shaw, Microsoft's chief communications officer, in a video presentation of the survey findings ahead of the event.

The observational study revealed that Copilot users were able to find information 27% faster and could catch up on missed meetings almost four times faster.

Perhaps a future iteration of Copilot could coach users on what to do with the time it saves them: While around half of those saving more than 30 minutes a day with it said they spent the time saved on focused work, one-sixth said they spent it in … more meetings.

7. Generative AI credentials

The generative AI domain is so new that it's hard to evaluate who knows what, so Microsoft is stepping in with new credentials in its Microsoft Applied Skills program covering AI. They'll span developing generative AI with Azure OpenAI Service; creating document processing systems with Azure AI Document Intelligence; building natural language processing tools with Azure AI Language; and building Azure AI Vision systems.

8. Streamlining generative AI operations on Azure

At Build in May, Microsoft announced Azure AI Studio, a unified system for building generative AI applications, and six months later it's finally launching a preview of the platform. (Generative AI technology is advancing fast but not, it seems, all that fast.) Developers will be able to select from a range of proprietary and open-source LLMs; choose data sources, including Microsoft Fabric OneLake and Azure AI Search for vector embeddings, enabling responses to be grounded in real-time data without having to retrain the whole model; and monitor their models' performance once deployed.

9. New Azure chips for enterprise AI workloads

Microsoft is updating its Azure infrastructure with new chips tailored for AI workloads.
To accelerate AI model training and generative inferencing, the ND MI300 v5 virtual machines will soon run on AMD's latest GPU, the Instinct MI300X, while the new NVL variant of Nvidia's H100 chip will power the NC H100 v5 VMs, currently in preview. Both will offer more memory per GPU to improve data processing efficiency.

But Microsoft is also adding custom chips of its own. It designed Azure Maia to accelerate AI training and inferencing workloads such as OpenAI models, GitHub Copilot, and ChatGPT. Maia has a companion, Azure Cobalt, for general (non-AI) workloads.

10. Easier development of small gen AI apps with Windows AI Studio

Azure AI Studio focuses on LLMs, but there's growing interest in less resource-intensive generative AI models trained for specific tasks, small enough to run locally on a PC or mobile device. To help developers customize and deploy such SLMs, Microsoft will soon release Windows AI Studio, which will provide the option of running models in the cloud or on the network edge, and include prompt-orchestration capabilities to keep things in sync wherever they run.

11. Using generative AI for knowledge management

Microsoft's Viva Engage enterprise communication tool offers a way for employees to learn from their peers by searching a database of answers to frequently asked questions provided by subject matter experts. An update to Answers in Viva, due to roll out before year end, will add an option to generate those answers (and even the questions they respond to) using AI, based on training files imported from other sources. This could offer enterprises a quick way to switch from a legacy knowledge management platform, or to share resources held in another system.
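A thread running through items 8, 10, and 11 is grounding a model's output in retrieved data rather than retraining the model itself: fetch the most relevant snippet at query time and feed it into the prompt as context. As a rough, hypothetical sketch only (the function names are invented, and plain word overlap stands in for the vector-embedding similarity a service like Azure AI Search would actually provide), the pattern looks something like this:

```python
# Illustrative sketch of retrieval-grounded prompting, not a Microsoft API.
# Real systems compare vector embeddings; simple word overlap stands in here.

def tokens(text: str) -> set[str]:
    """Lowercase words with trailing punctuation stripped."""
    return {w.strip("?.,!:;") for w in text.lower().split()}

def score(query: str, doc: str) -> int:
    """Crude relevance: how many query words appear in the document."""
    return len(tokens(query) & tokens(doc))

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    """Prepend the best-matching snippet as context for the model."""
    best = max(docs, key=lambda d: score(query, d))
    return f"Context: {best}\n\nQuestion: {query}\nAnswer using only the context."

docs = [
    "Expense reports are filed in the finance portal by the 5th.",
    "VPN access requires enrolling your device with the IT helpdesk.",
]
print(build_grounded_prompt("How do I get VPN access?", docs))
```

The point of the pattern is that the knowledge base can change daily (imported FAQ files, real-time data sources) while the model stays frozen; only the retrieval index needs updating.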