During my Global Azure Virtual 2020 live stream on Bringing serverless into the Enterprise, I had a few demo glitches. An inside joke among those who do presentations and demos is that the demo gods are either with you or against you. Some might say I didn't offer up a satisfying sacrifice to the demo gods. I would argue that I did, but I feel it's important to reflect on what went wrong and how I can be better prepared in the future by learning from my mistakes.
So, knowing that I presented on this topic for the Global Azure 2020 Virtual event and had some failed demos, I want to explain what happened, why, and how to be better prepared for a future talk. Hopefully it's a lesson you can apply to your own talks, presentations, or just your development efforts.
Back in February 2020 I submitted a few topics for the Global Azure 2020 event. At that point COVID-19 was already spreading, but the world hadn't shut down the way it has today, and the event was still going ahead. In March I was notified that my topic was selected, which gave me about six weeks to prepare. Fast forward to mid-March, and everything was being cancelled or made virtual. Our local Global Azure 2020 event was cancelled, so I stopped working on my presentation. I was invited to participate in another Global Azure 2020 Virtual Community event in the UK and Ireland, so I focused on that content instead.
About two weeks prior to the Global Azure 2020 event, I was notified that our local event would be made virtual after all, and I had to confirm whether I still wanted to participate. At this point I was not prepared, and my wife had signed up for a course over the weekend before the event, which left me with three kids (10 months, 4 years, and 7 years) to manage for 10 hours a day over a three-day weekend. My initial thought was to excuse myself from the event, but I really wanted to participate, and with COVID-19 halting everything, I felt it was important to maintain that community connection, even if the event was virtual.
It took me back to my college days of balancing school, work, and a project on a tight deadline. Not fun, but with coffee as my partner, I got the kids to bed and put in a couple of late nights to get it all done… or so I thought.
With my talk this year being on bringing serverless into the Enterprise, I focused on Azure Functions, and my demos covered the following topics to illustrate common enterprise use cases:
Using PowerShell in an Azure Function for automation tasks
Deploying code to Azure using GitHub Actions
Avoiding cold start and latency with Premium Functions
Monitoring logs for your Functions
My PowerShell Azure Function Failure
My first failed demo was something I knew was problematic going into the talk, but I felt it was important to cover anyway, and I had screenshots of a working state from previous attempts, so I felt comfortable proceeding. The demo was creating an Azure Function with PowerShell. The issue was that no matter what PowerShell command I tried to run, it would not run successfully; no matter what I did, I kept getting an error that the subscription could not be set.
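For context, the demo was along these lines. This is a hedged sketch, not the actual demo code; the subscription id, tag name, and timer binding are placeholders, and it assumes the Az module is available through the Functions managed dependencies feature. Set-AzContext is the kind of call that kept failing for me:

```powershell
# Hedged sketch of a PowerShell Azure Function for automation tasks
# (not the actual demo code). Assumes a timer trigger named "Timer"
# and the Az module loaded via managed dependencies.
param($Timer)

# Selecting the subscription context -- the step that kept failing
# in the demo with an error that the subscription could not be set.
Set-AzContext -SubscriptionId "00000000-0000-0000-0000-000000000000"

# Example automation task: stop tagged dev VMs on a schedule.
Get-AzVM -Status |
    Where-Object { $_.Tags["environment"] -eq "dev" } |
    ForEach-Object {
        Stop-AzVM -Name $_.Name -ResourceGroupName $_.ResourceGroupName -Force
    }
```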
Because you never know when something will go off the rails during a demo, you should always be prepared to go 'offline'. By that I mean showing screenshots of what you were trying to do and the expected outcome. You could even go so far as recording your demo and switching to the recording during your talk. I've never done this, but I've heard of people who have, and it worked perfectly: the audience had no idea the demo was broken, and the presenter was still able to convey the message.
That might be a bit extreme, but I usually do take some screenshots of the Azure portal as part of the notes I use to prepare the presentation, so I know I can always fall back on those if necessary, and in this case that is what I did. It's unfortunate I could not show the feature working as intended, but I let the audience know and continued to roll along.
My Premium Function Failure
This was my favorite demo I prepared for the talk. It involved creating an Azure Function hosted on the Premium plan and comparing it to the Consumption plan to show scale, latency, and that there is no more cold start in Azure Functions with the Premium plan.
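To make that comparison concrete, here is a minimal sketch (not part of the original demo) of how you might summarize response times from a load test run. The sample numbers are made up, with the Consumption plan's first request paying a cold-start penalty:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def summarize(latencies_ms):
    """p50/p95/max for a set of response times in milliseconds."""
    return {
        "p50": percentile(latencies_ms, 50),
        "p95": percentile(latencies_ms, 95),
        "max": max(latencies_ms),
    }

# Hypothetical samples: the first Consumption request is a cold start.
consumption = [4200, 180, 190, 175, 210, 185]
premium = [190, 175, 180, 185, 170, 195]

print("consumption:", summarize(consumption))
print("premium:", summarize(premium))
```

The tail (p95/max) is where the Consumption plan's cold start shows up, which is exactly what the side-by-side demo was meant to highlight.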
I prepared this demo before I worked on the GitHub Actions demo, which would come earlier in my presentation. The order of the demos plays an important role in why this failed, so I'll come back to it later.
To show the cold start and latency issues with the Azure Functions Consumption plan and how the Premium plan avoids them, I used a load testing site called Loader.io. The tool requires that the host URL be verified via a special token that must be returned from the site. To map my Azure Function's response to the URL that loader.io expected, I needed to configure an Azure Functions Proxy.
When I was setting up the demo, I first set up the proxy in the portal, and then I moved it to a proxies.json file in the Visual Studio solution.
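As a rough sketch, a proxies.json for this kind of token verification might look like the following. The token value, app name, and function route are hypothetical placeholders, not the demo's actual values:

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "loaderVerification": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/loaderio-abc123.txt"
      },
      "backendUri": "https://my-func-app.azurewebsites.net/api/LoaderVerify"
    }
  }
}
```

The proxy exposes the root-level path loader.io checks and forwards it to the function that returns the token.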
When I tested this demo, I was able to verify the token and use loader.io to load test my Consumption and Premium functions without issue. After finishing it, I moved on to the GitHub Actions demo, took a copy of the code, and used that for the CI/CD pipeline to push it up into Azure; that demo also worked without issue. When I tested the automated deployment, however, I only tested the function, not the load testing.
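The CI/CD setup might be sketched as a workflow like this one. The app name, secret name, and tool versions are assumptions rather than the demo's actual configuration, though Azure/functions-action is the action commonly used for this kind of deployment:

```yaml
# Hedged sketch of a GitHub Actions workflow deploying a .NET
# Azure Functions app; names and versions are placeholders.
name: Deploy Azure Function

on:
  push:
    branches: [ master ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '3.1.x'
      - name: Build and publish
        run: dotnet publish --configuration Release --output ./publish
      - name: Deploy to Azure Functions
        uses: Azure/functions-action@v1
        with:
          app-name: my-func-app   # hypothetical app name
          package: ./publish
          publish-profile: ${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}
```

Note that every run publishes a fresh copy of whatever the build output contains, which matters for the next part of this story.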
You may already have an idea of what caused the failed demo; if not, it's related to the proxies.json file. When I copied the file into my solution, I forgot to go to its properties and mark it as content to be deployed. So the GitHub Actions demo, which ran before the load testing demo, deployed a fresh copy of the app and removed the proxy I had originally set up in the portal. That meant that when loader.io needed to validate the token, it couldn't, and I hit a verification error in my demo that caught me a bit by surprise.
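For reference, one way to make sure the file survives a deployment is to mark it for copying in the project file. A sketch, assuming the default Functions SDK project layout:

```xml
<!-- Hedged sketch: ensure proxies.json is copied to the build output
     so a fresh deployment keeps the proxy. This is the step that was
     missed when the file was first added to the solution. -->
<ItemGroup>
  <None Update="proxies.json">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```

The same effect can be achieved in Visual Studio by setting the file's Copy to Output Directory property.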
I didn't have, or want to take, the time to live debug and find out what was wrong, as I feared I would go down a rabbit hole and totally derail my talk. So I moved on and explained as best I could what should have happened. Again, I had screenshots, but it wasn't as cool as showing it live.
Testing, rehearsing and what went wrong
Looking back at that presentation, I had under two weeks to prepare and was still working on the talk the morning of, finishing up a few areas. Normally I would not have left things to the last minute, but things were very fluid in February and March with COVID-19, and I wanted to put my best effort in for the community. I felt I could still manage it, just under less-than-ideal circumstances.
I worked on each demo individually, as they weren't really related except for the GitHub Actions demo. I should have done that one first, because I would have caught the token verification issue right away due to the missing proxy.
Speaking of token verification, it seems the token is valid for 24 hours. As I got close to the talk, I didn't want to warm up my functions because I wanted them in a cold state. By not testing them right before my talk, I missed seeing that the token had just expired, which would have shown me that the proxy was missing.
Due to the time crunch, when I rehearsed I didn't do my demos inline with the presentation; I did them separately. Had I run the demos along with the presentation, I potentially would have caught the expired token and the missing proxy. It's important to do an end-to-end test and walkthrough of the presentation material regardless of how comfortable you feel.
In retrospect, I should have gone back and troubleshot the issue at the end of my talk. As soon as I looked at the function afterward, I noticed the proxy was missing and was able to add it back quickly.
The fix would have taken me only five minutes to troubleshoot, which would have let me show the real demo. All in all, the talk went well and I got some really good feedback. No one complained about the broken demos, and I mentioned I would follow up with a blog post showing what went wrong and how I fixed it. I was a bit disappointed I couldn't show this demo live, as it's pretty awesome to see, so look for a future post where I'll set up a Premium function and throw some load at it. Maybe I'll even record it and post it to YouTube.
I hope you enjoyed this post and found something useful. I find it's important to acknowledge when we run into issues and share how we solve them.