
Microsoft Ignite 2023 Book of News

What is the Book of News? The Microsoft Ignite 2023 Book of News is your guide to the key news items announced at Ignite 2023.

AI, Copilot and Microsoft Fabric are the overarching themes of this year’s conference, as you will see throughout the sessions and announcements.

Some of my favourite announcements

Azure Cloud Native and Application Platform

Azure App Service

  • Single subnet support for multiple App Service plans is now generally available. Multiple App Service plans can now connect to a single subnet in a customer’s virtual network, substantially reducing management overhead for network administrators.
  • WebJobs on Linux is now in preview. WebJobs is a popular feature of Azure App Service that enables users to run background tasks in the Azure App Service without any additional cost. Previously available on Windows, it will extend to Linux, enabling customers to run background or recurring tasks and do things like send email reports or perform image or file processing.
  • Extensibility support on Linux is now in preview. Previously available on Windows, it will allow Linux web apps to take advantage of third-party software services on Azure and connect to Azure Native ISV services more easily.
  • gRPC support on App Service for Linux is now generally available. gRPC is a high-performance, open-source universal RPC framework that provides full bi-directional streaming and increased messaging performance over HTTP/2 for web apps running on App Service for Linux.

Azure Functions

  • Azure Functions now supports .NET 8 for applications using the isolated worker model. Support is now available for Windows and Linux on the consumption, elastic premium and application service plan hosting options. This update is generally available.
  • Flex Consumption Plan is a new Azure Functions hosting plan that builds on the consumption plan’s pay-for-what-you-use, serverless billing model. It will provide more flexibility and customizability without compromising on available features. New capabilities will include fast and large elastic scale, instance size selection, private networking, availability zones and high concurrency control. Users can request access to the private preview.

Azure Container Apps

  • Dedicated GPU workload profiles: Users will be able to run machine learning models with Azure Container Apps as a target compute platform, building event-driven intelligent applications to train models or derive data-driven insights. This feature is in preview.
  • Azure Container Apps landing zone accelerator: Simplifies building of a production-grade secured infrastructure at an enterprise scale to deploy fully managed, cloud-native apps and microservices. This feature is generally available.
  • Azure Container Apps code to cloud: Users will be able to focus on code and quickly take an application from source to cloud without the need to understand containers or how to package application code for deployment. This feature is in preview.
  • Vector database add-ins: Three of the most popular open-source vector databases, Qdrant, Milvus and Weaviate, are now available in preview as add-ins so developers can get started in a fast and affordable way. A minimal connection sketch follows below.
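
To make the vector database add-ins a little more concrete, here is a rough sketch of talking to a Qdrant add-in from another container app in the same environment using the qdrant-client Python package. The add-in’s internal hostname and port, the collection name and the embedding size are all assumptions for illustration; check your Container Apps environment for the actual add-in connection details.

```python
# Minimal sketch: querying a Qdrant vector database add-in from another
# container app in the same Container Apps environment. The internal
# endpoint "http://qdrant:6333" is an assumption for illustration.
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams, PointStruct

client = QdrantClient(url="http://qdrant:6333")  # hypothetical internal endpoint

# Create a small collection and upsert one embedding.
client.recreate_collection(
    collection_name="docs",
    vectors_config=VectorParams(size=4, distance=Distance.COSINE),
)
client.upsert(
    collection_name="docs",
    points=[PointStruct(id=1, vector=[0.1, 0.2, 0.3, 0.4], payload={"title": "hello"})],
)

# Query for the nearest neighbours of a query vector.
hits = client.search(collection_name="docs", query_vector=[0.1, 0.2, 0.3, 0.4], limit=3)
print(hits)
```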

Azure Kubernetes Service

  • The new Kubernetes AI toolchain operator automates large language model (LLM) deployment on AKS across available CPU and GPU resources by selecting optimally sized infrastructure for the model. It makes it possible to easily split inferencing across multiple lower-GPU-count VMs, increasing the number of Azure regions where workloads can run, eliminating wait times for higher-GPU-count VMs and lowering overall cost. Customers can also choose from preset models with images hosted by AKS, significantly reducing overall inference service setup time.
  • Additionally, Azure Kubernetes Fleet Manager enables multi-cluster and at-scale scenarios for AKS clusters. Platform admins managing Kubernetes fleets with many clusters often face challenges staging their updates in a safe and predictable way. Fleet Manager lets admins orchestrate updates across multiple clusters using update runs, stages and groups. This capability is now generally available.

Azure Event Grid

  • Azure Event Grid now supports additional capabilities to help customers capitalize on growing industry scenarios. A key part of this new functionality is the ability to deliver publish-subscribe messaging at scale, which enables flexible consumption patterns for data over HTTP and MQTT protocols. This capability is now generally available. A minimal MQTT publish sketch follows this list.
  • Pull delivery for event-driven architectures: Customers can process events from highly secure environments without configuring a public endpoint, control the rate and volume of messages consumed and support much larger throughput. This feature is generally available.
  • Push delivery to Azure Event Hubs: Event Grid namespaces will support the ability to push events to Azure Event Hubs at high scale through a namespace topic subscription. This enables the development of more distributed applications to send discrete events to ingestion pipelines. This feature is in preview.
  • Increased throughput units: To help customers scale to meet the demands of these new scenarios, Event Grid has also increased the number of throughput units available in an Event Grid namespace to 40, providing more capacity for data-intensive scenarios. This feature is generally available.
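
As a small taste of the MQTT support, here is a rough sketch of publishing telemetry to an Event Grid namespace with the paho-mqtt package. The hostname format, certificate files, client name and topic are placeholders; Event Grid namespaces use certificate-based client authentication over TLS on port 8883, so copy the real values from your namespace in the Azure portal.

```python
# Minimal sketch: publishing telemetry to an Event Grid namespace over MQTT.
# Hostname, client name, certificate files and topic are placeholders.
import ssl
import paho.mqtt.client as mqtt

HOSTNAME = "<your-namespace-mqtt-hostname>"  # copy from the Azure portal
CLIENT_NAME = "device1"                      # must match the client's authentication name

# paho-mqtt 1.6.x style constructor; MQTT v5 is supported by Event Grid namespaces.
client = mqtt.Client(client_id=CLIENT_NAME, protocol=mqtt.MQTTv5)
client.username_pw_set(username=CLIENT_NAME)
client.tls_set(
    certfile="device1-cert.pem",  # client certificate registered with the namespace
    keyfile="device1-key.pem",
    tls_version=ssl.PROTOCOL_TLS_CLIENT,
)

client.connect(HOSTNAME, port=8883)
client.loop_start()
client.publish("devices/device1/telemetry", payload='{"temperature": 21.5}', qos=1)
client.loop_stop()
client.disconnect()
```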

Azure Communication Services

  • Azure AI Speech integration into Azure Communication Services Call Automation workflows, generally available in November, will enable AI-assisted experiences for customers calling into a business.
  • Azure Communication Services job router, generally available in early December, will simplify the development of routing capabilities for inbound customer communications and steer customers to the most suitable point of contact in a business.

Azure API Management

  • API Management’s Credential Manager, now generally available, simplifies the management of authentication and authorization for both professional developers and citizen developers.
  • Defender for APIs, a new offering within Microsoft Defender for Cloud – a cloud-native application protection platform (CNAPP) – is now generally available. It integrates natively with Azure API Management, giving security admins visibility into business-critical Azure APIs so they can understand and improve their security posture, prioritize vulnerability fixes, and detect and respond to active runtime threats within minutes using machine learning-powered detection of anomalous and suspicious API usage.

Azure Migrate

  • The Azure Migrate application and code assessment, now generally available, complements the Azure Migrate assessment and migration tool to help modernize and re-platform large-scale .NET and Java applications through detailed code and application scanning and dependency detection. The tool produces a comprehensive report with recommended code changes, helping customers apply a broad range of code transformations across different use cases and code patterns.

Azure Data Platform

General

  • Amazon S3 shortcuts, now generally available, allow organizations to unify their data in Amazon S3 with their data in OneLake. With this update, data engineers can create a single virtualized data lake for their entire organization across Amazon S3 buckets and OneLake – without the latency of copying data from S3 and without changing overall data ownership.
  • Azure Data Lake Storage Gen2 (ADLS Gen2) shortcuts are now generally available, empowering data engineers to connect data from external ADLS Gen2 data lakes into OneLake through a live connection to the target data.

Azure SQL

  • Several new features and updates for Azure SQL will make the offering more cost-efficient, reliable and secure.

Microsoft Fabric

  • Microsoft Fabric, an integrated and simplified experience for a data estate on an enterprise-grade data foundation, is now generally available. Fabric enables persistent data governance and a single capacity pricing model that scales with growth, and it’s open at every layer with no proprietary lock-in. Fabric integrates Power BI, Data Factory and the next generation of Synapse to offer customers a cost-effective and easy-to-manage modern analytics solution for the era of AI.
  • Microsoft 365 data can now natively integrate with OneLake in the Delta Parquet format, the optimal format for data analysis. Microsoft 365 data was previously offered only in JSON format. With this new integration, Microsoft 365 data can be seamlessly joined with other data sources in OneLake, giving organizations access to a suite of analytical experiences for transforming and gaining insight from their data. This also means that AI capabilities built using Microsoft Fabric notebooks can directly access Microsoft 365 data within OneLake. This update is in preview. A short PySpark sketch follows this list.
  • Microsoft Fabric is being infused with Azure OpenAI Service at every layer to help customers unlock the full potential of their data, enabling developers to leverage the power of generative AI against their data and assisting business users to find insights in their data. This feature is in preview.
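
Here is a rough sketch of what working with that data could look like from a Microsoft Fabric notebook attached to a lakehouse, once the Microsoft 365 data has landed as a Delta table. The table names (M365Messages, employees) and columns are hypothetical; the `spark` session is pre-created in Fabric notebooks.

```python
# Minimal sketch, assuming a Fabric notebook attached to a lakehouse where the
# data has landed as Delta tables. Table and column names are hypothetical.

# Read a Delta table via the lakehouse's relative Tables path
# (spark.read.table("M365Messages") also works for default-lakehouse tables).
df = spark.read.format("delta").load("Tables/M365Messages")

# Join it with another (hypothetical) lakehouse table using Spark SQL.
df.createOrReplaceTempView("m365_messages")
joined = spark.sql("""
    SELECT m.UserId, m.SentDate, e.Department
    FROM m365_messages AS m
    JOIN employees AS e ON m.UserId = e.UserId
""")
joined.show(10)
```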

Azure Cosmos DB

  • Dynamic scaling per partition/region, now in preview for new Azure Cosmos DB accounts, will allow customers to optimize for scale and cost in situations where partitioning is used to scale individual containers in a database to meet the performance needs of applications, or where multi-region configuration of Azure Cosmos DB is used for global distribution of data.
  • Microsoft Copilot for Azure integration in Azure Cosmos DB, now in preview, will bring AI into the Azure Cosmos DB developer experience. Specifically, this release enables developers to turn natural language questions into Azure Cosmos DB NoSQL queries in the query editor of Azure Cosmos DB Data Explorer. This new feature will increase developer productivity by generating queries and written explanations of the query operations as they ask questions about their data.
  • Azure Cosmos DB for MongoDB vCore, now generally available, allows developers to build intelligent applications in Azure with MongoDB compatibility. With Azure Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership and the familiar vCore architecture when migrating existing applications or building new ones. Azure Cosmos DB for MongoDB vCore is also introducing a free tier, which is a developer-friendly way to explore the platform’s capabilities without any cost. Learn more about the free tier.
  • In addition, a new Azure AI Advantage offer will help customers realize the value of Azure Cosmos DB and Azure AI together. Benefits include:
    • Savings of up to 40,000 RU/s for three months on Azure Cosmos DB when using GitHub Copilot or Azure AI, including Azure OpenAI Service.
    • World-class infrastructure and security to grow business and safeguard data.
    • Enhanced reliability of generative AI applications by leveraging the speed of Azure Cosmos DB to retrieve and process data.
  • Vector search in Azure Cosmos DB for MongoDB vCore, now generally available, allows developers to seamlessly integrate their AI-based applications with the data stored in Azure Cosmos DB. Vector search enables users to efficiently store, index and query high-dimensional vector data, eliminating the need to transfer the data to more expensive alternatives, such as dedicated vector databases, just for vector search capabilities. A minimal pymongo sketch follows below.
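
Here is a rough sketch, using pymongo, of how creating and querying a vector index on Azure Cosmos DB for MongoDB vCore could look. The connection string, database, collection, field name and dimensions are placeholders, and the cosmosSearchOptions/$search shapes below follow the documented pattern as I understand it, so double-check them against the current docs.

```python
# Minimal sketch: vector search on Azure Cosmos DB for MongoDB vCore.
# Connection string, names, field and dimensions are placeholders.
from pymongo import MongoClient

client = MongoClient("<your-mongodb-vcore-connection-string>")
db = client["productcatalog"]
collection = db["items"]

# Create an IVF vector index over the embedding field.
db.command({
    "createIndexes": "items",
    "indexes": [{
        "name": "vectorIndex",
        "key": {"contentVector": "cosmosSearch"},
        "cosmosSearchOptions": {
            "kind": "vector-ivf",
            "numLists": 1,
            "similarity": "COS",
            "dimensions": 1536,
        },
    }],
})

# Query for the 5 nearest neighbours of a query embedding.
query_embedding = [0.0] * 1536  # replace with a real embedding
results = collection.aggregate([
    {"$search": {
        "cosmosSearch": {"vector": query_embedding, "path": "contentVector", "k": 5},
        "returnStoredSource": True,
    }},
])
for doc in results:
    print(doc.get("title"))
```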

SQL Server

  • Monitoring for SQL Server enabled by Azure Arc, now in preview, will allow customers to gain critical insights into their entire SQL Server estate across on-premises datacenter and cloud, optimize for database performance and diagnose problems faster. With this monitoring tool, customers will be empowered to switch from a reactive operation mode to a proactive one, further improving database uptime while reducing routine workloads.
  • Enhanced high availability and disaster recovery (HA/DR) management for SQL Server enabled by Azure Arc is now in preview. With Azure Arc, customers can now improve SQL Server business continuity and consistency by viewing and managing Always On availability groups, failover cluster instances and backups directly from the Azure portal. This new capability will provide customers with better visibility and a much easier and more flexible way to configure critical database operations.
  • Extended Security Updates for SQL Server enabled by Azure Arc are now generally available. These updates provide critical security patches for up to three years after the end of extended support and are delivered as a service through Azure Arc, so customers running older SQL Server versions on-premises or in multicloud environments can manage security patches from the Azure portal. Extended Security Updates enabled by Azure Arc also give financial flexibility with a pay-as-you-go subscription model.

Azure Infrastructure

AI

  • Custom-built silicon for AI and enterprise workloads in the Microsoft Cloud
  • Today, Microsoft is announcing new custom silicon that complements Microsoft’s offerings with industry partners. The two new chips, Microsoft Azure Maia and Microsoft Azure Cobalt, were built with a holistic view of hardware and software systems to optimize performance and price.
  • Microsoft Azure Maia is an AI Accelerator chip designed to run cloud-based training and inferencing for AI workloads, such as OpenAI models, Bing, GitHub Copilot and ChatGPT.
  • Microsoft Azure Cobalt is a cloud-native chip based on Arm architecture optimized for performance, power efficiency and cost-effectiveness for general-purpose workloads.
  • Azure Boost is now generally available
  • One of Microsoft Azure’s latest and most significant infrastructure improvements, Azure Boost, is now generally available. Azure Boost enables greater network and storage performance at scale, improves security and reduces servicing impact by moving virtualization processes traditionally performed by the host servers, such as networking, storage and host management, onto purpose-built hardware and software optimized for these processes. This innovation allows Microsoft to achieve the fastest remote and local storage performance in the market today: remote storage at 12.5 Gbps (gigabits per second) throughput and 650K IOPS (input/output operations per second), and local storage at 17.3 Gbps throughput and 3.8M IOPS.
  • ND MI300 v5 virtual machines with AMD chips optimized for generative AI workloads
  • The ND MI300 v5 virtual machines are designed to accelerate the processing of AI workloads for high-range AI model training and generative inferencing, and will feature AMD’s latest GPU, the AMD Instinct MI300X.
  • NC H100 v5 virtual machines with the latest NVIDIA GPUs
  • The new NC H100 v5 Virtual Machine (VM) Series, in preview, is built on the latest NVL variant of the NVIDIA Hopper 100 (H100), which will offer greater memory per GPU. The new VM series will provide customers with greater performance, reliability and efficiency for mid-range AI training and generative AI inferencing. By maintaining more memory per GPU in the VM, customers increase data processing efficiency and enhance overall workload performance.

Azure Migrate

  • Azure Migrate, the service used to migrate to and modernize in Azure, is introducing discovery, business case analysis and assessment support for new workloads. This allows customers to analyze their configuration and compatibility for new use cases so they can determine appropriately sized Azure instances at optimal cost and without blockers.
  • Specific features in preview include Spring apps assessment, business case with management costs, business case and assessment with security, and Windows and SQL ESU in business case. Web apps assessment is generally available.

Azure IoT Operations

Azure IoT Operations is a new addition to the Azure IoT portfolio that will offer a unified, end-to-end Microsoft solution for digitally transforming physical operations seamlessly from the cloud to the edge.

That unified approach consists of the following:

  • Management plane: One control plane to secure and govern assets and workloads across cloud to edge with Azure Arc.
  • Application development: Consistently build and deploy apps anywhere, in the cloud or at the edge.
  • Cloud-to-edge data plane: Seamless integration at the data level from asset to cloud and back again.
  • Common infrastructure: Customers can connect investments in the cloud with their on-premises resources.

Learn more about Accelerating Industrial Transformation with Azure IoT Operations

Azure Management and Operations

Azure Chaos Studio

Azure Chaos Studio, now generally available, provides a fully managed experimentation platform for discovering challenging issues through experiment templates, dynamic targets and a more guided user interface.

Azure AI Services

Azure AI Studio

  • Microsoft is launching the preview of its unified AI platform, Azure AI Studio, which will empower all organizations and professional developers to innovate and shape the future with AI.

Azure AI Vision

  • Liveness functionality and Vision SDK: Liveness functionality helps prevent face recognition spoofing attacks and conforms to ISO 30107-3 PAD Level 2. The Vision SDK for Face will enable developers to easily add face recognition and liveness checks to mobile applications. Both features are in preview.
  • Image Analysis 4.0: This API introduces cutting-edge Image Analysis models, encompassing image captioning, OCR, object detection and more, all accessible through a single, synchronous API endpoint. Notably, the enhanced OCR model boasts improved accuracy for both typed and handwritten text in images. Image Analysis 4.0 is generally available. A small REST call sketch follows this list.
  • Florence foundation model: Trained with billions of text-image pairs and integrated as cost-effective, production-ready computer vision services in Azure AI Vision, this improved feature enables developers to create cutting-edge, market-ready, responsible computer vision applications across various industries. Florence foundation model is generally available.
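
Here is a rough sketch of calling the Image Analysis 4.0 endpoint with plain requests to get a caption and OCR text for an image. The endpoint, key, api-version and the response field names are assumptions based on the documented shape of the API at the time, so verify them against the current Azure AI Vision documentation.

```python
# Minimal sketch: Image Analysis 4.0 via REST. Endpoint, key, api-version
# and response field names are assumptions; check the current docs.
import requests

ENDPOINT = "https://<your-vision-resource>.cognitiveservices.azure.com"
KEY = "<your-key>"

url = f"{ENDPOINT}/computervision/imageanalysis:analyze"
params = {
    "api-version": "2023-10-01",   # assumed GA version at the time of writing
    "features": "caption,read",    # image captioning + OCR
}
headers = {
    "Ocp-Apim-Subscription-Key": KEY,
    "Content-Type": "application/json",
}
body = {"url": "https://example.com/sample.jpg"}  # publicly reachable image URL

response = requests.post(url, params=params, headers=headers, json=body)
response.raise_for_status()
result = response.json()

print(result.get("captionResult", {}).get("text"))
print(result.get("readResult"))
```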

Azure OpenAI Service

  • DALL·E 3: DALL·E 3 is an AI model that generates images from text descriptions. Users describe an image, and DALL·E 3 will create it. DALL·E 3 is in preview.
  • GPT-3.5 Turbo model with a 16k token prompt length and GPT-4 Turbo: The latest models in Azure OpenAI Service will enable customers to extend prompt length and bring even more control and efficiency to their generative AI applications. Both models will be available in preview at the end of November 2023. A short usage sketch follows this list.
  • GPT-4 Turbo with Vision (GPT-4V): When integrated with Azure AI Vision, GPT-4V will enhance experiences by allowing images or videos to be included along with text when generating text output, benefiting from Azure AI Vision enhancements such as video analysis. GPT-4V will be in preview by the end of 2023.
  • GPT-4 updates: Azure OpenAI Service has also rolled out updates to GPT-4, including the ability for fine-tuning. Fine-tuning will allow organizations to customize the AI model to better suit their specific needs. It’s akin to tailoring a suit to fit perfectly, but in the world of AI. Updates to GPT-4 are in preview.
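
Here is a rough sketch of calling these models with the openai Python package (v1.x) pointed at an Azure OpenAI resource. The deployment names and api_version are assumptions; use the deployments you have created in your own resource and a currently supported API version.

```python
# Minimal sketch: chat completion and image generation against Azure OpenAI.
# Deployment names and api_version are placeholders/assumptions.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-key>",
    api_version="2023-12-01-preview",  # assumed preview version; check docs
)

# Chat completion against a GPT-4 Turbo (or GPT-3.5 Turbo 16k) deployment.
chat = client.chat.completions.create(
    model="<your-gpt4-turbo-deployment>",
    messages=[{"role": "user", "content": "Summarize the Ignite 2023 Book of News in one sentence."}],
)
print(chat.choices[0].message.content)

# Image generation against a DALL·E 3 deployment.
image = client.images.generate(
    model="<your-dalle3-deployment>",
    prompt="A watercolor painting of a data center among clouds",
    n=1,
)
print(image.data[0].url)
```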

Azure AI Video Indexer

  • Video-to-text summary: Users will be able to extract the essence of video content and generate concise and informative text summaries. The advanced algorithm segments videos into coherent chapters, leveraging visual, audio and text cues to create sections that are easily accommodated in large language model (LLM) prompt windows. Each section contains essential content, including transcripts, audio events and visual elements. This is ideal for creating video recaps, training materials or knowledge-sharing.
  • Efficient Video Content Search: Users will be able to transform video content into a searchable format using LLMs and Video Indexer’s insights. By converting video insights into LLM-friendly prompts, the main highlights are accessible for effective searching. Scene segmentation, audio events and visual details further enhance content division, allowing users to swiftly locate specific topics, moments or details within extensive video.

This year’s Ignite was packed with lots of new announcements and features that I can’t wait to start using in my applications.

Enjoy!

Click here to read the Microsoft Ignite 2023 Book of News!