Tagged with Azure
I've been using Azure App Services (i.e. Web Apps) for a few years now and I've been mostly happy with the result, though from time to time I've had trouble with how the App Service environment works (mostly with which version of .NET Core is running).
To try and eliminate that (and possibly save some cost), I decided to switch my apps to use Docker Containers. I thought I'd share how I did it in case you want to do this as well.
This will be a three part series:
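As a taste of where the series is headed, a minimal two-stage Dockerfile for an ASP.NET Core app might look something like this (the image tags and `MyBlog.dll` are placeholders for whatever your app actually targets, not the blog's real setup):

```dockerfile
# Build stage: restore, build, and publish the app
# (the 8.0 tags are examples; use the SDK version your app targets)
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Runtime stage: a much smaller image with just the ASP.NET runtime
FROM mcr.microsoft.com/dotnet/aspnet:8.0
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyBlog.dll"]
```

The two-stage split is what gives you the cost/size win: the SDK image only exists during the build, and the image you actually deploy carries just the runtime and your published output.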
This blog has existed for 15 years now and I've moved it from server to server, service to service, in many forms over the years. As I moved servers, one of my biggest pains was copying all the images and downloads from server to server.
My site code took up about 1% of the space; all those embedded images and downloads took up the majority of it. I was sick of it, especially when deploying the site (or keeping the site in Git), so I decided to switch to storing them in Azure Storage (or AWS if you prefer).
So when I wrote the .NET Core version of the blog, I decided to bite the bullet and start storing them there. But I wanted to enable it directly from blog authoring. I'm using the MetaWeblog API middleware I wrote to do this (you can read more about it on GitHub). But I needed a small service to actually support saving new images to the storage service.
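The interesting non-Azure part of such a service is deciding where an uploaded image goes and what content type to serve it with. Here's a small Python sketch of that step; `make_blob_name` and its year/month naming convention are my own illustration, not part of the middleware mentioned above:

```python
import mimetypes
import re
from datetime import datetime, timezone

def make_blob_name(filename, when=None):
    """Build a safe blob path and guess the content type for an uploaded image.

    The yyyy/mm prefix and the sanitizing rules are invented conventions
    for this sketch, not anything Azure Storage requires.
    """
    when = when or datetime.now(timezone.utc)
    # Keep only characters that are unambiguously safe in a blob name.
    safe = re.sub(r"[^A-Za-z0-9._-]", "-", filename).strip("-").lower()
    content_type, _ = mimetypes.guess_type(safe)
    blob_name = f"{when:%Y/%m}/{safe}"
    return blob_name, content_type or "application/octet-stream"
```

The actual upload would then hand `blob_name`, the bytes, and the content type to the storage SDK; keeping this naming logic separate makes it easy to test without any cloud credentials.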
I run this blog and other sites on Azure App Services (formerly called Azure Websites). As you might know, all that code is open source on GitHub and I use that code to deploy directly to Azure.
I use the GitHub deployment that Azure offers so that every time I push a change to my master branch, it creates a new deployment for me. It's been pretty great, except...the deployment is pretty slow. Normally the speed of this deployment wouldn't matter a lot, except of course when I push a bug out to 'live'. Then the speed really matters.
I was perusing the builds and noticed that a build was taking 1014 seconds. That's an ASP.NET project with very little client-side building (no webpack or similar). Getting the source, doing the restore, building the project, and deploying it all shouldn't be taking 16+ minutes.
As you can see, I recently updated this blog. I wrote the new blog using ASP.NET Core RC1 (and related technologies), so when the time came to deploy it, I had some issues.
At the time I thought it was Azure, but after testing with an empty project that worked, I figured it was probably something I did. In this post, I’ll talk about what I did to get it to work in Azure Websites.
There are a couple of ways to do this. I preferred the publish-from-source approach, but you can use the Visual Studio publish options too:
A while back, I decided that this blog deserved a fresh coat of paint, and since I'm digging into ASP.NET Core, it was logical to re-write it. I wanted to do more than just change the look; I wanted to make some real changes to the code and finally open source it too!
Open sourcing the code required that I do a few things. First of all, I had to change any code that I would be embarrassed by (not a trivial task), but I also had to make sure that the usual secrets (e.g. connection strings) weren't exposed by open sourcing it. But as of now, I've done it. The source is available and the site is live! I am sure that there are issues with the site, but hopefully I'll iron those out as they crop up.
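One language-agnostic way to keep secrets like connection strings out of a public repository is to read them from the environment at runtime. The real site is ASP.NET Core, which has its own configuration system, so this Python sketch just shows the pattern; the variable name `BLOG_DB_CONNECTION` is invented for illustration:

```python
import os

def get_secret(name):
    """Fetch a secret (e.g. a connection string) from an environment
    variable so it never needs to be committed to source control."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Secret {name!r} is not configured")
    return value

# BLOG_DB_CONNECTION is an invented name; you'd set it at deploy time
# (for example in the Azure portal's application settings), not in Git.
```

Azure App Service surfaces these as application settings in the portal, so the values live in the service configuration rather than in the repository.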
My goal here wasn’t to build a blog engine. This isn’t a carefully designed and architected solution that is easily skinned for your site, but it should be a good start to understand how to build a site with ASP.NET Core and related technologies. You can see the code here:
Let me start this post by saying I might not know what I am doing. It happens more than you might imagine. I love Azure Websites and use it pretty extensively for my ASP.NET hosting... this blog is even using it. Love it.
I also host a couple of Ghost blog sites using Azure Websites. This works sometimes… but usually it's a nasty rash of trial and error, and I often give up. Here's the story of getting my wife's and my blog running on Ghost and Azure Websites, which left me pulling my hair out yesterday.
This all started a few months ago when our blog went down for some unknown reason. Suddenly the same Node.js version and Ghost version stopped working. As I investigated, it seemed that reverting to the original theme fixed it. The original theme is boring, so I hated that solution, but I went with it because I didn't have time to dig in (we were on our honeymoon at the time). I just finished recording my new course for Pluralsight, so I had a day free to find out what was happening.
My next stop this week was in the town of Gloucester in England. The group that ran this meetup was great and had everything setup to make this an easy talk to give. I especially want to thank Franck Terray and Sophie Lipowska for running the meetup.
For this stop, we talked about both ASP.NET Web API 2 and Azure Websites. I merged the two by building an API and then deploying it to Microsoft's cloud. Lots of great questions later, we stopped by the pub for a nice talk with the hard-core members. A great time was had by me.
Here are the promised resources from the talk:
I am a developer first. I’ve become my family’s IT department but not by choice. This is the fate of most developers I know.
For the past year or so I’ve been experimenting with Azure Websites as a solution for quick, one-off sites and even for class examples. I’m a big fan. Let me tell you why.
Azure is a diverse landscape with lots of services. It can be a little daunting. I've used some of these services, but certainly not all. Azure Websites is particularly interesting to me in that it fulfills my desire to not be an IT guy. Why?
In this new course I build a new web site from scratch. I start out with a Bootstrap template (since my design skills suck) and move through creating content, building a database, exposing a RESTful API, and building a Single Page Application. I wrap it up by publishing the site to Azure Web Sites, showing you not only how to get your application up and running in the cloud, but also how to monitor it and handle standard tasks like using your own domain in Azure.
You will see every line of code (with one small exception) that I write. This isn't slide-ware…it's show-n-tell. I use a range of technologies including:
I recently helped the Atlanta Code Camp effort by building them a new website. You can see it here: Atlanta Code Camp. I am pretty proud of what I was able to accomplish in the scant number of hours I had to build it. It's not done, as we'll need to improve it once the speakers are chosen and the schedule is set up, but so far I am pretty happy with it.
I had a number of goals for the project:
My first thought was to start with a mobile-first template and just build the site, but #4 was going to stymie that, as Pluralsight would really like me to finish my courses ;) So I started with a Bootstrap template (that I got from https://wrapbootstrap.com). This provided a good basis for the shell of the website. Before I did any real color skinning of the site, I needed to wait for our logo. Dennis Estanislao did an amazing job on it. With the logo in hand, I was able to use its color scheme to change the template to match the overall theme. But that was just the HTML part of the story.
I've ported my XBoxGames Database (see this blog article for copies of the .mdf files) to SQL Azure and added OData support. You can find the feed here:
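Once you have the feed URL, querying it is just standard OData query options (`$top`, `$filter`, `$orderby`) appended to an entity set. As an illustration, here's a small Python helper that builds such a URL; the feed address and the `Games` entity set name below are placeholders, not the real feed:

```python
from urllib.parse import urlencode

def odata_query(feed_url, entity_set, top=None, filter_expr=None, orderby=None):
    """Build an OData query URL from the standard $top/$filter/$orderby options."""
    params = {}
    if top is not None:
        params["$top"] = str(top)
    if filter_expr:
        params["$filter"] = filter_expr
    if orderby:
        params["$orderby"] = orderby
    query = urlencode(params)  # percent-encodes the '$' prefixes
    url = f"{feed_url.rstrip('/')}/{entity_set}"
    return f"{url}?{query}" if query else url
```

Any HTTP client (or a browser) can then fetch the resulting URL and get the matching entries back as an Atom/OData response.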
If you are thinking about using SQL Data Services (the data part of the Azure stack) in your Silverlight 2 project, think again. As you might know, ADO.NET Data Services (Astoria) will not work cross-domain regardless of a security policy file (because of some limitations in the two networking stacks that Silverlight 2 uses). It's a problem, but in most use cases ADO.NET Data Services is used on the same domain, so no biggie... but...
Azure's SQL Data Services uses Astoria to expose its data to the client... that means you can't access SQL Data Services with the ADO.NET client library. The reality is that since SQL Data Services requires basic authentication, it would not be terribly secure to call it in any case, but this seals the deal.
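For reference, the security policy file mentioned above is the `clientaccesspolicy.xml` that a service hosts at its root to opt in to cross-domain Silverlight calls. A permissive example looks like this (and, per the above, it still doesn't help for Astoria):

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
```

Normally this file is what unlocks cross-domain HTTP from Silverlight; the Astoria client library's networking path is the exception that ignores it.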