Azure

Microsoft Azure now offers free data transfer out to the internet when leaving Azure

Earlier this week, Microsoft announced that it would follow suit with Google and Amazon in eliminating Azure egress fees. The following outlines Azure’s commitment to customer choice and details the process for transferring data out of Azure with financial incentives.

Free Egress Offer

Azure now provides free data egress for customers leaving Azure, allowing them to transfer their data to another cloud provider or an on-premises data center without incurring internet egress fees.

Eligibility and Process

The first 100 GB per month is free globally. For data transfer beyond that, customers must contact Azure Support and follow specific instructions to be eligible for the credit.
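The credit mechanics described above can be sketched as simple billing logic. This is an illustrative toy: the 100 GB free tier comes from the announcement, but the per-GB rate below is a placeholder, not an actual Azure price.

```python
# Illustrative toy of the egress credit logic: the 100 GB free tier is
# from the announcement; the per-GB rate is a placeholder, not a real price.

FREE_TIER_GB = 100              # free data transfer out per month, globally
PLACEHOLDER_RATE_PER_GB = 0.05  # assumed rate for illustration only

def billable_egress_gb(total_gb: float) -> float:
    """GB billed after the monthly free tier is applied."""
    return max(0.0, total_gb - FREE_TIER_GB)

def estimated_cost(total_gb: float, credited: bool) -> float:
    """If the customer qualifies for the exit credit (transfer complete and
    all associated Azure subscriptions cancelled), the charge is credited back."""
    if credited:
        return 0.0
    return billable_egress_gb(total_gb) * PLACEHOLDER_RATE_PER_GB

print(billable_egress_gb(80))              # under the free tier -> 0.0
print(estimated_cost(500, credited=True))  # exit credit applied -> 0.0
```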

Data Transfer Credit

Once the data transfer is complete and all associated Azure subscriptions are cancelled, Azure Support will apply the credit.

Compliance with Data Act

This policy aligns with the European Data Act and is available to all Azure customers worldwide.

Reference

https://azure.microsoft.com/en-us/updates/now-available-free-data-transfer-out-to-internet-when-leaving-azure

AI, GitHub, GitHub Copilot

GitHub Copilot Chat is now generally available

This week GitHub announced that GitHub Copilot Chat is now generally available. All GitHub Copilot users can now enjoy natural language-powered coding with Copilot Chat at no additional cost for both Visual Studio Code and Visual Studio. It is also free to verified teachers, students, and maintainers of popular open-source projects.

What is GitHub Copilot Chat? GitHub Copilot Chat is a chat interface that lets you interact with GitHub Copilot, an AI-powered coding assistant, within supported IDEs. You can use GitHub Copilot Chat to ask and receive answers to coding-related questions, get code suggestions, explanations, unit tests, and bug fixes, and learn new languages or frameworks.

GitHub Copilot Chat is powered by OpenAI’s GPT-4 model, which is fine-tuned specifically for dev scenarios. You can prompt GitHub Copilot Chat in natural language to get help with learning new languages or frameworks, troubleshooting bugs, writing unit tests, detecting vulnerabilities, and more. GitHub Copilot Chat also supports multiple languages, so you can communicate with it in your preferred language.

To get started with GitHub Copilot Chat, you need to sign up for GitHub Copilot and install the GitHub Copilot extension in your IDE. You can then access GitHub Copilot Chat from the sidebar and start chatting with it. In Visual Studio, go to the View menu and select “GitHub Copilot Chat” to show the pane.

Then ask it a question…

You can also check out the documentation and the FAQ for more information.

Enjoy!

Reference

GitHub Copilot Chat is now generally available for organizations and individuals – The GitHub Blog

Using GitHub Copilot Chat in your IDE – GitHub Docs

Azure, Events, Learning

Microsoft Ignite 2023 Book of News

What is the Book of News? The Microsoft Ignite 2023 Book of News is your guide to the key news items that are announced at Ignite 2023.

AI, Copilot and Microsoft Fabric are the overarching themes at this year’s conference, as you will see throughout the sessions and announcements.

Some of my favourite announcements

Azure Cloud Native and Application Platform

Azure App Service

  • Single subnet support for multiple App Service plans is now generally available. Network administrators gain a substantial reduction in management overhead thanks to the new capability enabling multiple App Service plans to connect to a single subnet in a customer’s virtual network.
  • WebJobs on Linux is now in preview. WebJobs is a popular feature of Azure App Service that enables users to run background tasks in the Azure App Service without any additional cost. Previously available on Windows, it will extend to Linux, enabling customers to run background or recurring tasks and do things like send email reports or perform image or file processing.
  • Extensibility support on Linux is now in preview. Previously available on Windows, it will allow Linux web apps to take advantage of third-party software services on Azure and connect to Azure Native ISV services more easily.
  • gRPC support on App Service for Linux is now generally available. gRPC is a high-performance, open-source universal RPC framework that provides full bi-directional streaming and increased messaging performance over HTTP/2 for web apps running on App Service for Linux.

Azure Functions

  • Azure Functions now supports .NET 8 for applications using the isolated worker model. Support is now available for Windows and Linux on the consumption, elastic premium and application service plan hosting options. This update is generally available.
  • Flex Consumption Plan is a new Azure Functions hosting plan that will build on the consumption, pay-for-what’s-used, serverless billing model. It will provide more flexibility and customizability without compromising on available features. New capabilities will include fast and large elastic scale, instance size selection, private networking, availability zones and high concurrency control. Users can request access to the private preview.

Azure Container Apps

  • Dedicated GPU workload profiles: Users will be able to run machine learning models with Azure Container Apps as a target compute platform to build event driven intelligent applications to train models or derive data-driven insights. This feature is in preview.
  • Azure Container Apps landing zone accelerator: Simplifies building of a production-grade secured infrastructure at an enterprise scale to deploy fully managed, cloud-native apps and microservices. This feature is generally available.
  • Azure Container Apps code to cloud: Users will be able to focus on code and quickly take an application from source to cloud without the need to understand containers or how to package application code for deployment. This feature is in preview.
  • Vector database add-ins: Three of the most popular open-source vector database variants, Qdrant, Milvus and Weaviate, are now available in preview as add-ins for developers to get started in a fast and affordable way.

Azure Kubernetes Service

  • The new Kubernetes AI toolchain operator automates LLM model deployment on AKS across available CPU and GPU resources by selecting optimally sized infrastructure for the model. It makes it possible to easily split inferencing across multiple lower-GPU-count VMs, increasing the number of Azure regions where workloads can run, eliminating wait times for higher-GPU-count VMs and lowering overall cost. Customers can also choose from preset models with images hosted by AKS, significantly reducing overall inference service setup time.
  • Additionally, Azure Kubernetes Fleet Manager enables multi-cluster and at-scale scenarios for AKS clusters. Platform admins who are managing Kubernetes fleets with many clusters often face challenges staging their updates in a safe and predictable way. This allows admins to orchestrate updates across multiple clusters by using update runs, stages and groups. This is now generally available.
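As a rough illustration of the sizing problem the AI toolchain operator solves, the sketch below picks a VM SKU and instance count for a model's GPU-memory footprint, preferring the plan with the fewest total GPUs. The SKU catalog and numbers are hypothetical, not real Azure SKUs or the operator's actual logic.

```python
# Hypothetical SKU catalog for illustration: name -> (gpus_per_vm, gpu_memory_gb_per_gpu)
import math

SKUS = {
    "small-gpu-vm": (1, 24),
    "mid-gpu-vm": (2, 24),
    "large-gpu-vm": (4, 80),
}

def plan_deployment(model_memory_gb, available):
    """Return (sku, instance_count) using the fewest total GPUs that
    provide enough aggregate GPU memory to host the model."""
    best = None
    for sku in available:
        gpus, mem_per_gpu = SKUS[sku]
        per_vm_memory = gpus * mem_per_gpu
        count = math.ceil(model_memory_gb / per_vm_memory)
        total_gpus = count * gpus
        if best is None or total_gpus < best[2]:
            best = (sku, count, total_gpus)
    if best is None:
        raise ValueError("no SKU available")
    return best[0], best[1]

# A 140 GB model fits on one 4x80GB VM; when that SKU has no regional
# capacity, inference can be split across several lower-GPU-count VMs.
print(plan_deployment(140, ["large-gpu-vm"]))                # ('large-gpu-vm', 1)
print(plan_deployment(140, ["small-gpu-vm", "mid-gpu-vm"]))
```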

Azure Event Grid

  • Azure Event Grid now supports additional capabilities to help customers capitalize on growing industry scenarios. A key part of this new functionality is the ability to deliver publish-subscribe messaging at scale, which enables flexible consumption patterns for data over HTTP and MQTT protocols. This capability is now generally available.
  • Pull delivery for event-driven architectures: This allows customers to process events from highly secure environments without configuring a public endpoint, while controlling the rate and volume of messages consumed and supporting much larger throughput. This feature is generally available.
  • Push delivery to Azure Event Hubs: Event Grid namespaces will support the ability to push events to Azure Event Hubs at high scale through a namespace topic subscription. This enables the development of more distributed applications to send discrete events to ingestion pipelines. This feature is in preview.
  • Increased throughput units: To help customers scale to meet the demands of these new scenarios, Event Grid has also increased the number of throughput units available in an Event Grid namespace to 40, meeting the needs of more data-intensive scenarios by providing more capacity. This feature is generally available.
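The pull-delivery model above inverts push: the consumer asks for a batch of events and settles them once processed, which is how it controls rate and volume. Below is a minimal in-memory simulation of that receive/acknowledge flow; it is a toy, not the actual Event Grid namespaces API.

```python
from collections import deque

class NamespaceTopicSim:
    """Toy stand-in for a namespace topic subscription with pull delivery."""
    def __init__(self):
        self._queue = deque()
        self._inflight = {}
        self._next_token = 0

    def publish(self, event):
        self._queue.append(event)

    def receive(self, max_events):
        """Consumer pulls up to max_events; each comes with a lock token."""
        batch = []
        while self._queue and len(batch) < max_events:
            token = self._next_token
            self._next_token += 1
            batch.append((token, self._queue.popleft()))
            self._inflight[token] = batch[-1][1]
        return batch

    def acknowledge(self, tokens):
        """Settle processed events so they are not redelivered."""
        for t in tokens:
            self._inflight.pop(t, None)

topic = NamespaceTopicSim()
for i in range(5):
    topic.publish({"id": i})

batch = topic.receive(max_events=3)   # the consumer chooses the batch size
topic.acknowledge(t for t, _ in batch)
print(len(batch))                     # 3 events pulled, 2 still queued
```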

Azure Communication Services

  • Azure AI Speech integration into Azure Communication Services Call Automation workflows, generally available in November, will enable AI-assisted experiences for customers calling into a business.
  • Azure Communication Services job router, generally available in early December, will simplify the development of routing capabilities for inbound customer communications and steer customers to the most suitable point of contact in a business.

Azure API Management

  • API Management’s Credential Manager, now generally available, simplifies the management of authentication and authorization for both professional developers and citizen developers.
  • Defender for APIs, a new offering as part of Microsoft Defender for Cloud – a cloud-native application protection platform (CNAPP), is now generally available. Natively integrated with Azure API Management, it gives security admins visibility into their business-critical Azure APIs so they can understand and improve their security posture, prioritize vulnerability fixes, and detect and respond to active runtime threats within minutes using machine learning-powered detections of anomalous and suspicious API usage.

Azure Migrate

  • The Azure Migrate application and code assessment, now generally available, complements the Azure Migrate assessment and migration tool to help modernize and re-platform large-scale .NET and Java applications through detailed code and application scanning and dependency detection. The tool offers a comprehensive report with recommended code changes, helping customers apply a broad range of code transformations across different use cases and code patterns.

Azure Data Platform

General

  • Amazon S3 shortcuts, now generally available, allow organizations to unify their data in Amazon S3 with their data in OneLake. With this update, data engineers can create a single virtualized data lake for their entire organization across Amazon S3 buckets and OneLake – without the latency of copying data from S3 and without changing overall data ownership.
  • Azure Data Lake Storage Gen2 (ADLS Gen2) shortcuts are now generally available, empowering data engineers to connect data from external ADLS Gen2 data lakes into OneLake through a live connection to the target data.

Azure SQL

  • Several new features and updates for Azure SQL will make the offering more cost-efficient, reliable and secure.

Microsoft Fabric

  • Microsoft Fabric, an integrated and simplified experience for a data estate on an enterprise-grade data foundation, is now generally available. Fabric enables persistent data governance and a single capacity pricing model that scales with growth, and it’s open at every layer with no proprietary lock-ins. Fabric integrates Power BI, Data Factory and the next generation of Synapse to offer customers a price-effective and easy-to-manage modern analytics solution for the era of AI.
  • Microsoft 365 data is now able to natively integrate to OneLake in the Delta Parquet format, the optimal format for data analysis. Microsoft 365 data was previously offered only in JSON format. With this new integration, Microsoft 365 data will be seamlessly joined with other data sources in OneLake, enabling access to a suite of analytical experiences for organizations to transform and gain insight from their data. This also means that AI capabilities built using Microsoft Fabric notebooks will now directly access Microsoft 365 data within OneLake. This update is in preview.
  • Microsoft Fabric is being infused with Azure OpenAI Service at every layer to help customers unlock the full potential of their data, enabling developers to leverage the power of generative AI against their data and assisting business users to find insights in their data. This feature is in preview.

Azure Cosmos DB

  • Dynamic scaling per partition/region, now in preview for new Azure Cosmos DB accounts, will allow customers to optimize for scale and cost in situations where partitioning is used to scale individual containers in a database to meet the performance needs of applications, or where multi-region configuration of Azure Cosmos DB is used for global distribution of data.
  • Microsoft Copilot for Azure integration in Azure Cosmos DB, now in preview, will bring AI into the Azure Cosmos DB developer experience. Specifically, this release enables developers to turn natural language questions into Azure Cosmos DB NoSQL queries in the query editor of Azure Cosmos DB Data Explorer. This new feature will increase developer productivity by generating queries and written explanations of the query operations as they ask questions about their data.
  • Azure Cosmos DB for MongoDB vCore, now generally available, allows developers to build intelligent applications in Azure with MongoDB compatibility. With Azure Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership and the familiar vCore architecture when migrating existing applications or building new ones. Azure Cosmos DB for MongoDB vCore is also introducing a free tier, which is a developer-friendly way to explore the platform’s capabilities without any cost. Learn more about the free tier.
  • In addition, a new Azure AI Advantage offer will help customers realize the value of Azure Cosmos DB and Azure AI together. Benefits include:
    • Savings up to 40,000 RU/s for three months on Azure Cosmos DB when using GitHub Copilot or Azure AI, including Azure OpenAI Service.
    • World-class infrastructure and security to grow business and safeguard data.
    • Enhanced reliability of generative AI applications by leveraging the speed of Azure Cosmos DB to retrieve and process data.
  • Vector search in Azure Cosmos DB MongoDB vCore, now generally available, allows developers to seamlessly integrate their AI-based applications with the data stored in Azure Cosmos DB. Vector search enables users to efficiently store, index and query high-dimensional vector data, eliminating the need to transfer the data to more expensive alternatives for vector search capabilities, such as vector databases.
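To make the vector search idea concrete, here is a pure-Python sketch of ranking documents by cosine similarity over stored embeddings. The documents and embedding values are made up; in Azure Cosmos DB for MongoDB vCore the equivalent work happens inside the database via a vector index rather than in application code.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Documents stored with (hypothetical) embedding vectors alongside their data.
docs = [
    {"id": "a", "text": "serverless functions", "embedding": [0.9, 0.1, 0.0]},
    {"id": "b", "text": "vector databases",     "embedding": [0.1, 0.9, 0.2]},
    {"id": "c", "text": "container apps",       "embedding": [0.8, 0.2, 0.1]},
]

def vector_search(query_embedding, top_k=2):
    """Return the ids of the top_k documents most similar to the query."""
    ranked = sorted(
        docs,
        key=lambda d: cosine_similarity(query_embedding, d["embedding"]),
        reverse=True,
    )
    return [d["id"] for d in ranked[:top_k]]

print(vector_search([1.0, 0.0, 0.0]))  # most similar to "serverless functions"
```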

SQL Server

  • Monitoring for SQL Server enabled by Azure Arc, now in preview, will allow customers to gain critical insights into their entire SQL Server estate across on-premises datacenter and cloud, optimize for database performance and diagnose problems faster. With this monitoring tool, customers will be empowered to switch from a reactive operation mode to a proactive one, further improving database uptime while reducing routine workloads.
  • Enhanced high availability and disaster recovery (HA/DR) management for SQL Server enabled by Azure Arc is now in preview. With Azure Arc, customers can now improve SQL Server business continuity and consistency by viewing and managing Always On availability groups, failover cluster instances and backups directly from the Azure portal. This new capability will provide customers with better visibility and a much easier and more flexible way to configure critical database operations.
  • Extended Security Updates for SQL Server enabled by Azure Arc is now generally available. Extended Security Updates for SQL Server, which provide critical security updates for up to three years after the end of extended support, are now available as a service through Azure Arc. With the Extended Security Update service, customers running older SQL Server versions on-premises or in multicloud environments can manage security patches from the Azure portal. Extended Security Updates enabled by Azure Arc give financial flexibility with a pay-as-you-go subscription model.

Azure Infrastructure

AI

Custom-built silicon for AI and enterprise workloads in the Microsoft Cloud

  • Today, Microsoft is announcing new custom silicon that complements Microsoft’s offerings with industry partners. The two new chips, Microsoft Azure Maia and Microsoft Azure Cobalt, were built with a holistic view of hardware and software systems to optimize performance and price.
  • Microsoft Azure Maia is an AI accelerator chip designed to run cloud-based training and inferencing for AI workloads, such as OpenAI models, Bing, GitHub Copilot and ChatGPT.
  • Microsoft Azure Cobalt is a cloud-native chip based on Arm architecture, optimized for performance, power efficiency and cost-effectiveness for general-purpose workloads.

Azure Boost is now generally available

  • One of Microsoft Azure’s latest and most significant infrastructure improvements, Azure Boost, is now generally available. Azure Boost enables greater network and storage performance at scale, improves security, and reduces servicing impact by moving virtualization processes traditionally performed by the host servers, such as networking, storage and host management, onto purpose-built hardware and software optimized for these processes. This innovation allows Microsoft to achieve the fastest remote and local storage performance in the market today: remote storage at 12.5 Gbps (gigabits per second) throughput and 650K IOPS (input/output operations per second), and local storage at 17.3 Gbps throughput and 3.8M IOPS.

ND MI300 v5 virtual machines with AMD chips optimized for generative AI workloads

  • The ND MI300 v5 virtual machines are designed to accelerate the processing of AI workloads for high-range AI model training and generative inferencing, and will feature AMD’s latest GPU, the AMD Instinct MI300X.

NC H100 v5 virtual machines with the latest NVIDIA GPUs

  • The new NC H100 v5 Virtual Machine (VM) series, in preview, is built on the latest NVL variant of the NVIDIA Hopper 100 (H100), which offers greater memory per GPU. The new VM series will provide customers with greater performance, reliability and efficiency for mid-range AI training and generative AI inferencing. With more memory per GPU in the VM, customers increase data processing efficiency and enhance overall workload performance.

Azure Migrate

  • Azure Migrate, the service used to migrate to and modernize in Azure, is introducing discovery, business case analysis and assessment support for new workloads. This allows customers to analyze their configuration and compatibility for new use cases so they can determine appropriately sized Azure instances at optimal cost and without blockers.
  • Specific features in preview include Spring apps assessment, business case with management costs, business case and assessment with security, and Windows and SQL ESU in business case; Web apps assessment is generally available.

Azure IoT Operations

Azure IoT Operations is a new addition to the Azure IoT portfolio that will offer a unified, end-to-end Microsoft solution that digitally transforms physical operations seamlessly from the cloud to the edge.

That unified approach consists of the following:

  • Management plane: One control plane to secure and govern assets and workloads across cloud to edge with Azure Arc.
  • Application development: Consistently build and deploy apps anywhere, in the cloud or at the edge.
  • Cloud-to-edge data plane: Seamless integration at the data level from asset to cloud and back again.
  • Common infrastructure: Customers can connect investments in the cloud with their on-premises resources.

Learn more about Accelerating Industrial Transformation with Azure IoT Operations

Azure Management and Operations

Azure Chaos Studio

Azure Chaos Studio, now generally available, provides a fully managed experimentation platform for discovering challenging issues through experiment templates, dynamic targets and a more guided user interface.
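Conceptually, a chaos experiment injects a fault and then checks that the system's steady-state hypothesis still holds. The local simulation below illustrates that loop with a hypothetical flaky dependency and a retry policy; it does not use the Chaos Studio API.

```python
import random

def flaky_dependency(fail_rate, rng):
    """Simulated downstream call; returns True on success."""
    return rng.random() >= fail_rate

def service_call(fail_rate, rng, retries=2):
    """System under test: retries the dependency a few times."""
    return any(flaky_dependency(fail_rate, rng) for _ in range(retries + 1))

def run_experiment(fail_rate, requests=1000, seed=0):
    """Inject the fault for every request and measure the success rate."""
    rng = random.Random(seed)  # seeded so the experiment is repeatable
    successes = sum(service_call(fail_rate, rng) for _ in range(requests))
    return successes / requests

baseline = run_experiment(fail_rate=0.0)     # no fault injected
under_chaos = run_experiment(fail_rate=0.3)  # 30% dependency failures injected
print(baseline, under_chaos)
# Steady-state hypothesis: with retries, availability stays above 95%
# despite the injected fault.
assert under_chaos > 0.95
```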

Azure AI Services

Azure AI Studio

  • Microsoft is launching the preview of its unified AI platform, Azure AI Studio, which will empower all organizations and professional developers to innovate and shape the future with AI.

Azure AI Vision

  • Liveness functionality and Vision SDK: Liveness functionality will help prevent face recognition spoofing attacks and conforms to ISO 30107-3 PAD Level 2. Vision SDK for Face will enable developers to easily add face recognition and liveness to mobile applications. Both features are in preview.
  • Image Analysis 4.0: This API introduces cutting-edge Image Analysis models, encompassing image captioning, OCR, object detection and more, all accessible through a single, synchronous API endpoint. Notably, the enhanced OCR model boasts improved accuracy for both typed and handwritten text in images. Image Analysis 4.0 is generally available.
  • Florence foundation model: Trained with billions of text-image pairs and integrated as cost-effective, production-ready computer vision services in Azure AI Vision, this improved feature enables developers to create cutting-edge, market-ready, responsible computer vision applications across various industries. Florence foundation model is generally available.

Azure OpenAI Service

  • DALL·E 3: An AI model that generates images from text descriptions. Users describe an image, and DALL·E 3 will be able to create it. DALL·E 3 is in preview.
  • GPT-3.5 Turbo model with a 16k token prompt length and GPT-4 Turbo: The latest models in Azure OpenAI Service will enable customers to extend prompt length and bring even more control and efficiency to their generative AI applications. Both models will be available in preview at the end of November 2023.
  • GPT-4 Turbo with Vision (GPT-4V): When integrated with Azure AI Vision, GPT-4V will enhance experiences by allowing the inclusion of images or videos along with text for generating text output, benefiting from Azure AI Vision enhancement like video analysis. GPT-4V will be in preview by the end of 2023.
  • GPT-4 updates: Azure OpenAI Service has also rolled out updates to GPT-4, including the ability for fine-tuning. Fine-tuning will allow organizations to customize the AI model to better suit their specific needs. It’s akin to tailoring a suit to fit perfectly, but in the world of AI. Updates to GPT-4 are in preview.
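Even with the longer prompt windows above, applications still need to keep conversations within the model's token budget. Below is a minimal sketch that trims the oldest chat turns to fit, using an assumed four-characters-per-token heuristic; a real app would use a proper tokenizer.

```python
def estimate_tokens(text):
    """Rough heuristic: ~4 characters per token (assumption, not exact)."""
    return max(1, len(text) // 4)

def trim_history(messages, max_tokens, keep_system=True):
    """Drop the oldest turns (keeping the system prompt) until the
    conversation fits within the model's prompt window."""
    system = [m for m in messages if m["role"] == "system"] if keep_system else []
    turns = [m for m in messages if m["role"] != "system"]
    while turns and sum(estimate_tokens(m["content"]) for m in system + turns) > max_tokens:
        turns.pop(0)  # discard the oldest turn first
    return system + turns

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "x" * 400},  # ~100 tokens of old context
    {"role": "user", "content": "y" * 400},  # ~100 tokens of old context
    {"role": "user", "content": "latest question"},
]

trimmed = trim_history(history, max_tokens=120)
print([m["content"][:6] for m in trimmed])  # oldest turn dropped
```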

Azure AI Video Indexer

  • Video-to-text summary: Users will be able to extract the essence of video content and generate concise and informative text summaries. The advanced algorithm segments videos into coherent chapters, leveraging visual, audio and text cues to create sections that are easily accommodated in large language model (LLM) prompt windows. Each section contains essential content, including transcripts, audio events and visual elements. This is ideal for creating video recaps, training materials or knowledge-sharing.
  • Efficient Video Content Search: Users will be able to transform video content into a searchable format using LLMs and Video Indexer’s insights. By converting video insights into LLM-friendly prompts, the main highlights are accessible for effective searching. Scene segmentation, audio events and visual details further enhance content division, allowing users to swiftly locate specific topics, moments or details within extensive video.
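The chapter segmentation described above can be approximated as greedy packing of transcript segments under a prompt-window budget. This toy uses a character budget in place of tokens and ignores the audio and visual cues the service also uses for boundaries.

```python
def segment_transcript(segments, budget_chars):
    """Greedily pack consecutive transcript segments into chapters,
    each no larger than budget_chars."""
    chapters, current, size = [], [], 0
    for seg in segments:
        if current and size + len(seg) > budget_chars:
            chapters.append(current)   # close the chapter at the budget
            current, size = [], 0
        current.append(seg)
        size += len(seg)
    if current:
        chapters.append(current)
    return chapters

transcript = [
    "Welcome to the session on Azure AI.",
    "First, we cover Video Indexer basics.",
    "Next, summarization with LLM prompts.",
    "Finally, searching within long videos.",
]

chapters = segment_transcript(transcript, budget_chars=80)
print(len(chapters))  # each chapter fits the 80-character budget
```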

This year’s Ignite was packed with lots of new announcements and features that I can’t wait to start using in my applications.

Enjoy!

Click here to read the Microsoft Ignite 2023 Book of News!

Community, MVP, Personal Development

Callon Campbell awarded 2023-2024 Microsoft MVP in Azure

This month I received an exciting email from Microsoft: I was re-awarded, for the 6th year in a row, the 2023-2024 Microsoft Most Valuable Professional (MVP) award in Azure. Receiving the Microsoft MVP award is both a humbling and exciting experience. It means you’re a member of a select group of just over 3,000 experts from around the world. Still, I like to think of it as doing something I’m passionate about with other like-minded individuals, having fun and always having something new to learn and share with the community.

“The Microsoft MVP Award is an annual award that recognizes exceptional technology community leaders worldwide who actively share their high-quality, real-world expertise with users and Microsoft. All of us at Microsoft recognize and appreciate Callon’s extraordinary contributions and want to take this opportunity to share our appreciation with you.” – The Microsoft Most Valuable Professional (MVP) Award Team, Microsoft Corporation

If you’re interested in learning about the Microsoft MVP program and seeing what it takes to become a Microsoft MVP, or how to get awarded, I encourage you to take a look at the Microsoft MVP website and also the following article on “How to become a Microsoft MVP” where they explain some of the details of the program.

To wrap up this post I would like to congratulate all the other newly awarded or renewed Microsoft MVPs all over the world! You truly are an amazing community and I’m truly humbled and honoured to be part of this group.

Enjoy!

References

Microsoft MVP Award

How to become a Microsoft MVP

Callon Campbell MVP Profile

AI, Azure, Events, Learning

Microsoft Build 2023 Book of News

What is the Book of News? The Microsoft Build 2023 Book of News is your guide to the key news items that are announced at Build 2023.

As expected there is a lot of focus on Azure and AI, followed by Microsoft 365, Security, Windows, and Edge & Bing. This year the book of news is interactive instead of being a PDF.

Some of my favourite announcements

Azure Cloud Native and Application Platform

Azure API Management

  • Azure API Center: A new service that will enable organizations to centralize and manage their portfolio of APIs, regardless of type, life cycle or deployment location. This update is in preview. Learn more about updates to Azure API Center.
  • WebSocket API passthrough: Allows users to manage, protect, observe and expose WebSocket APIs running in container environments with the API Management self-hosted gateway container. This is now generally available. Learn more about WebSocket API passthrough.
  • Self-hosted gateway support for Azure Active Directory (Azure AD) tokens: Users will be able to secure communication between the self-hosted gateway and Azure when downloading configuration, using Azure AD tokens. This allows customers to avoid manually refreshing a gateway token that expires every 30 days. This update is generally available.

Azure Event Grid

  • Leverage HTTP to enable “pull” delivery of discrete events to provide more flexible consumption patterns at high scale.
  • Enable publish-subscribe via the MQTT protocol, enabling bidirectional communication at scale between Internet of Things (IoT) devices and cloud-based services.
  • Enable routing of MQTT data to other Azure services and third-party services for further data analytics and storage.

These updates are now in preview. Learn more about public preview of MQTT protocol and pull message delivery in Azure Event Grid.

Azure Functions

  • Deploy containerized Azure Functions in an Azure Container Apps environment to quickly build event-driven, cloud-native apps leveraging built-in Dapr integrations for distributed, microservice-based serverless apps.
  • Maximize developer velocity using the Azure Functions integrated programming model: write code in your preferred programming language or framework that Azure Functions supports and get built-in service integrations with triggers and bindings for a first-class, event-driven, cloud-native experience.
  • Run Azure Functions alongside other microservices, APIs, websites, workflows or any containerized app using an Azure Container Apps environment built for robust serverless scale, microservices and fully managed infrastructure.

Azure Kubernetes Service

  • Long-term support is now generally available, starting with Kubernetes 1.27. Once enabled, this provides a two-year support window for a specific version of Kubernetes. Kubernetes delivers new releases every three to four months to keep up with the pace of innovation in the cloud-native world. To give enterprises more control over their environment, long-term support for Kubernetes enables customers to stay on the same release for two years – twice as long as what’s possible today. This is a long-awaited development in the cloud-native, open-source community for customers who need the additional option for a longer upgrade runway with continued support and security updates.
  • Transactable Kubernetes apps, now generally available, allow AKS customers to explore a vibrant ecosystem of first- and third-party Kubernetes-ready solutions from Azure Marketplace, and purchase and securely deploy them on AKS with easy click-through deployments. Conveniently integrated with Azure billing, these solutions are ready to use, taking advantage of all the benefits of running on a cloud-native platform like AKS.
  • Confidential containers in AKS, now in preview, is a first-party offering that will allow teams to run standard unmodified containers, aligned with the Kata Confidential Containers open-source project, to achieve zero trust operator deployments with AKS. These containers can be integrated with the typical services used by apps running on AKS for monitoring, logging, etc. in a trusted execution environment (TEE), with each pod assigned its own memory encryption key, providing hardware-based confidentiality and integrity protections, underscoring Microsoft’s focus on enterprise-readiness for these workloads. Learn more about confidential containers in AKS.
  • The multi-cluster update in Azure Kubernetes Fleet Manager (Fleet), in preview, will enable multi-cluster and at-scale scenarios for AKS clusters. The new multi-cluster update feature gives teams the ability to orchestrate planned updates across multiple clusters for a consistent environment.

Azure Communication Services

  • A new set of application programming interfaces (APIs), generally available next month, will help developers build server-based, intelligent calling workflows into their apps and simplify the delivery of personalized customer engagement with additional AI capabilities from Azure Cognitive Services.
  • Call automation interoperability into Microsoft Teams will be in preview next month for businesses that want to connect experts who use Teams into existing customer service calls. From daily appointment bookings and order updates to complex customer outreach for marketing and customer service, call automation with Azure Communication Services is changing the landscape of customer engagement.

Azure Data Platform

Microsoft Fabric

  • Microsoft Fabric, now in preview, delivers an integrated and simplified experience for all analytics workloads and users on an enterprise-grade data foundation. It brings together Power BI, Data Factory and the next generation of Synapse in a unified software as a service (SaaS) offering to give customers a price-effective and easy-to-manage modern analytics solution for the era of AI. Fabric has experiences for all workloads and data professionals in one place – including data integration, data engineering, data warehousing, data science, real-time analytics, applied observability and business intelligence – to increase productivity like never before.
  • To further enable organizations to accelerate value creation with their data, Microsoft is integrating Copilot in Microsoft Fabric, in preview soon, to enable the use of natural language and a chat experience to generate code and queries, create AI plugins using a low/no-code experience, enable custom Q&A, tailor semantics and components within the plugin and deploy to Microsoft Teams, Power BI and web. With AI-driven insights, customers can focus on telling the right data story and let Copilot do the heavy lifting.
  • Organizational data is hosted on Microsoft’s unified foundation, OneLake, which provides a single source of truth and reduces the need to extract, move or replicate data, helping eliminate rogue data sprawl. Fabric also enables persistent data governance and a single capacity pricing model that scales with growth, and it’s open at every layer with no proprietary lock-ins. Deep integrations with Microsoft 365, Teams and AI Copilot experiences accelerate and scale data value creation for everyone. From data professionals to non-technical business users, Fabric has role-tailored experiences to empower everyone to unlock more value from data.

Azure Cosmos DB

  • Burst capacity: Developers can achieve better performance and productivity with burst capacity, which allows customers to utilize the idle throughput capacity of their database or container to handle traffic spikes. Databases using standard provisioned throughput with burst capacity enabled can maintain performance during short bursts when requests exceed the throughput limit. This gives under-provisioned customers a cushion and results in fewer rate-limited requests. This update is generally available.
  • Hierarchical partition keys: More efficient partitioning strategies and improved performance are made possible by hierarchical partition keys, which enable up to three partition keys to be used instead of one. This removes the performance trade-offs developers often face when forced to choose a single partition key and enables more optimal data distribution at high scale. This is generally available.
  • Materialized Views for Azure Cosmos DB for NoSQL: With Materialized Views, now in preview, users can create and maintain secondary views of their data in containers to serve queries that would be too expensive to run against the source container. Once defined, a materialized view is kept in sync with its source container automatically.
  • Azure Cosmos DB "All versions and deletes" change feed mode: Developers will be able to get a full view of changes to items occurring within the continuous backup retention period of their account, saving time and reducing app complexity. This is in preview.
  • .NET and Java SDK telemetry + Application Insights: Monitoring apps will be easier with this update, now in preview. The Azure Cosmos DB .NET and Java SDKs support distributed tracing to help developers easily monitor their apps and troubleshoot issues, improving performance and developer productivity.
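Burst capacity, described above, behaves much like a token bucket that banks idle throughput (request units per second) and spends it during spikes. The sketch below is a conceptual model only; the accumulation window and numbers are illustrative, not the service's actual limits.

```python
# Conceptual model of burst capacity: idle RU/s are banked (up to a cap)
# and spent when a traffic spike exceeds the provisioned rate.
class BurstBucket:
    def __init__(self, provisioned_rus: int, max_idle_seconds: int = 300):
        self.provisioned = provisioned_rus
        self.banked = 0.0
        # Hypothetical cap: at most `max_idle_seconds` worth of idle throughput.
        self.max_banked = provisioned_rus * max_idle_seconds

    def tick(self, requested_rus: float) -> bool:
        """Process one second of traffic; return True if served without throttling."""
        if requested_rus <= self.provisioned:
            # Unused capacity is banked, up to the cap.
            self.banked = min(self.max_banked,
                              self.banked + self.provisioned - requested_rus)
            return True
        overage = requested_rus - self.provisioned
        if overage <= self.banked:
            self.banked -= overage  # Spend banked capacity to absorb the spike.
            return True
        return False  # Spike exceeds provisioned + banked capacity: rate limited.

bucket = BurstBucket(provisioned_rus=400)
quiet = [bucket.tick(100) for _ in range(10)]  # light traffic banks capacity
spike = bucket.tick(1500)                      # short spike absorbed by banked RUs
print(all(quiet), spike)  # → True True
```

In this model, a container provisioned at 400 RU/s that idles for a while can briefly serve 1,500 RU/s without throttling, which is the cushion the announcement describes.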

Windows Platform

  • After listening to developer feedback, Microsoft created a home for developers on Windows with a renewed focus on productivity and performance across all stages of the development lifecycle. These features, now in preview, include:
  • Dev Home will allow users to quickly set up their machines, connect to GitHub and monitor and manage workflows in one central location. Dev Home is open source and extensible, allowing users to enhance their experience with a customizable dashboard and the tools they need to be successful. Users can also add GitHub widgets to track projects and system widgets to track CPU and GPU performance.
  • Windows Package Manager now includes WinGet configuration, which handles the setup requirements for an ideal development environment on a Windows machine using a WinGet configuration file, reducing device setup time from days to hours. Developers no longer need to search for the right versions of software, packages, tools or frameworks to download, or the settings to apply: WinGet configuration reduces this manual, error-prone process to a single command.
  • Dev Drive is a new type of storage volume designed to give developers a file system that meets their needs for both performance and security. It is based on the Resilient File System (ReFS) and, combined with a new performance mode capability in Microsoft Defender Antivirus, provides up to a 30% improvement in build times for file input/output (I/O) scenarios over the in-market Windows 11 version. The new performance mode is more secure for developer workloads than folder or process exclusions, providing a solution that balances security with performance.
  • Windows Terminal is getting smarter with GitHub Copilot X. Users of GitHub Copilot will be able to take advantage of natural language AI both inline and in an experimental chat experience to recommend commands, explain errors and take actions within the Terminal app. Microsoft is also experimenting with GitHub Copilot-powered AI in other developer tools like WinDbg to help developers complete tasks with less toil.
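To make the WinGet configuration idea above concrete, here is a minimal sketch of what such a file looks like, based on Microsoft's published samples; the package IDs and version number are illustrative assumptions, not a definitive setup.

```yaml
# Hypothetical configuration.dsc.yaml, applied with:
#   winget configure -f configuration.dsc.yaml
properties:
  configurationVersion: 0.2.0
  resources:
    - resource: Microsoft.WinGet.DSC/WinGetPackage
      directives:
        description: Install Git
      settings:
        id: Git.Git
        source: winget
    - resource: Microsoft.WinGet.DSC/WinGetPackage
      directives:
        description: Install Visual Studio Code
      settings:
        id: Microsoft.VisualStudioCode
        source: winget
```

Each resource is a declarative step; running the single `winget configure` command applies them all, which is how a days-long manual setup collapses into one command.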

Click here to read the Microsoft Build 2023 Book of News!

Enjoy!

Azure · This week on Azure Friday

Azure Confidential Computing with Confidential VMs and AKS nodes | This week on Azure Friday

In this episode of Azure Friday, Amar Gowda joins Scott Hanselman to show how Azure Confidential Computing protects data in use and helps you achieve data security and data privacy goals within a managed cloud environment. Confidential VMs protect VM and container workloads with memory encryption and code integrity. Attestation helps you remotely verify that the entire VM is a hardware-based Trusted Execution Environment (TEE).
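The attestation step mentioned above works by having the attestation service return a signed JWT describing the TEE, which a relying party decodes and checks before trusting the VM. The sketch below shows that consumer-side check in plain Python; the claim names are illustrative assumptions, and a real verifier must also validate the token's signature against the service's published signing keys.

```python
# Conceptual sketch of consuming an attestation token (a JWT).
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def looks_like_confidential_vm(claims: dict) -> bool:
    # Hypothetical policy: require a hardware-backed TEE claim.
    tee = claims.get("x-ms-isolation-tee", {})
    return tee.get("x-ms-attestation-type") == "sevsnpvm"

# Build a fake token for illustration (header.payload.signature).
payload = {"x-ms-isolation-tee": {"x-ms-attestation-type": "sevsnpvm"}}
fake = ".".join([
    base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(payload).encode()).decode().rstrip("="),
    "",
])
print(looks_like_confidential_vm(decode_jwt_payload(fake)))  # → True
```

The point of the demo in the episode is the same flow end to end: the hardware produces evidence, the attestation service signs claims about it, and your code makes a trust decision from those claims.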

Chapters

  • 00:00 – Introduction
  • 01:20 – Azure Confidential Computing
  • 04:00 – Creating a confidential VM from CLI demo
  • 06:28 – Attestation sample client to verify hardware
  • 12:00 – Extensions and CVM on AKS support
  • 13:09 – AKS demo
  • 20:00 – Wrap-up

Source: Azure Friday

Azure · Cloud · This week on Azure Friday

Introducing Azure Cosmos DB for PostgreSQL | This week on Azure Friday

In this episode of Azure Friday, Kirill Gavrylyuk and Charles Feddersen join Scott Hanselman to explain Distributed SQL PostgreSQL in Azure Cosmos DB.

Chapters

  • 00:00 – Introduction
  • 02:12 – Azure Cosmos DB for PostgreSQL demo
  • 09:36 – Demo loading data from Azure Storage and sharding
  • 15:34 – Demo creating a replica for reduced latency
  • 21:56 – Wrap-up

Source: Azure Friday

Azure · Cloud · Events

Microsoft Ignite 2022 —

Judson Althoff, EVP and Chief Commercial Officer, officially kicked off Microsoft Ignite, and below is a capture of that moment. Microsoft Ignite got underway a few hours ago and is now running live, and some great announcements have already come through from the sessions. I will be covering […]

Microsoft Ignite 2022 —
.NET · Azure

Deploying a Blazor WebAssembly App to Azure App Service — The Code Blogger

In previous articles, we covered various basic aspects of the Blazor WebAssembly application. In this article, we demonstrate how a Blazor WebAssembly app can be deployed to an Azure App Service. What are the various options for deploying Blazor apps? There are two different types of Blazor applications – Blazor Server Apps…

Deploying a Blazor WebAssembly App to Azure App Service — The Code Blogger
Azure · Cloud · This week on Azure Friday

Develop tools for developing with Azure Cosmos DB | This week on Azure Friday

In this episode of Azure Friday, Estefani Arroyo joins Scott Hanselman to talk about and demo Azure Cosmos DB desktop tools for developing, querying and testing your applications. The Azure Cosmos DB Linux Emulator provides a high-fidelity emulation of the Azure Cosmos DB service. The Azure Cosmos DB API for MongoDB extension for Azure Data Studio enables you to connect to your MongoDB resources and query your data using the mongo shell.

Chapters

  • 00:00 – Introduction
  • 01:03 – Emulator configuration options
  • 02:20 – Emulator pre-requisites
  • 04:36 – Adding certificates
  • 07:20 – Azure Cosmos DB emulator
  • 08:31 – Querying data
  • 09:27 – Python sample app to try it
  • 10:30 – Visualizing data with Azure Data Studio
  • 12:26 – Mongo shell
  • 15:13 – Wrap-up

Source: Azure Friday
