Azure on My Mind

It has been a while since I’ve written anything about Azure. Not that there isn’t anything noteworthy happening. There are constant updates to Azure, so many in fact that it’s tough to keep track of them all. Today I want to let you know about three items that are of special interest to me:

  1. The “New” Azure Portal
  2. The Azure Marketplace
  3. Project Nami

The New Azure Portal

Microsoft has had the new portal available in preview for about a year. Depending on the Azure services you needed to manage, it could be confusing to know which portal to use, especially for new services that were only supported in the new portal. Portal-switching confusion is finally over (I hope) as the new portal reached GA (general availability) on December 3rd, 2015. It supports 26 services, compared with the “classic” portal’s 23. Going forward you should be able to accomplish all of your Azure management tasks from the new portal.

The Azure Marketplace

There are many benefits to cloud computing but one of the features that really helps Azure stand apart for me is the Gallery.  The gallery allows you to quickly provision a virtual machine or other workload based on industry standard stacks.  There are thousands of certified applications pre-configured for Microsoft Azure in the Azure Marketplace.

I’ve used several of them and they really accelerate my testing and investigation of new products and use cases. I’ve traditionally focused on Microsoft workloads like SQL Server and SharePoint, but there are hundreds of third-party offerings including Linux VMs and stacks from companies like Oracle, Barracuda, Citrix, Hortonworks, Commvault and many others. That leads me to my last update …

Project Nami

Project Nami has been making my life a lot easier (and cheaper) for a few months now. As you may well know, one of the most popular blogging and website platforms currently is WordPress. WordPress requires a database and is designed to use MySQL. Microsoft supports WordPress as a preconfigured application in the Azure Marketplace, but you get a 20GB MySQL database provided by a third party known as ClearDB. If you exceed your database size allocation, you can cripple your site, and if you need more than 20GB you would need to pay ClearDB for additional services.

Project Nami to the rescue! Being a Microsoft-focused IT Pro, I’m more comfortable with MS SQL Server than MySQL. Project Nami allows you to run your WordPress site on an MS SQL Server database instead of MySQL. Even more interesting is that you can deploy directly to Azure in under 5 minutes. I’m currently using Project Nami in Azure to host the website of the Surface Smiths podcast that I do with David Smith, and it hasn’t let me down yet.

Cloudy in Canada

Today Microsoft announced two new datacentres in Canada:

“I’m very pleased to share the announcement of a major investment by Microsoft in locally delivered cloud services across two new Azure regions in the provinces of Ontario and Quebec coming in CY16. These facilities are home to the hardware that powers the Microsoft cloud and its capabilities.

Local delivery of cloud services supports our range of offerings including Microsoft Azure, our hyperscale, enterprise-grade, and hybrid public cloud offering; Office 365, which puts Microsoft Office at your fingertips virtually anywhere, across all of your favorite devices; and Microsoft Dynamics, our leading Customer Relationship Management system.”

This announcement is very welcome, as many organizations in Canada are a little cautious about using cloud services that do not guarantee data residency. There is a lot of confusion and concern around this issue. Much of the trepidation is probably not warranted, but organizations, their auditors and their lawyers are being overcautious while watching what happens in the ongoing battle with the DOJ, in which Microsoft refuses to provide Office 365 emails from an Irish datacentre.

If Microsoft is eventually compelled to deliver the emails in question, will it matter to Canadian organizations that there are datacenters in Quebec City and Toronto?

Data Sovereignty – A Lesson from the Future

  1. Data stored overseas should be accessible to US government, judge rules – Source New York Times… er, Reuters
  2. Obama administration contends that company with operations in US must comply with warrants for data, even if stored abroad – Source The Guardian

With the rulings this summer that Microsoft must provide the US government with customer data even if it is stored outside of the United States, many organizations and individuals alike are concerned about data sovereignty and privacy – and they should be. However, legal issues like data sovereignty and Safe Harbor are distractions from the real issue.

Let’s start with a definition of Data Sovereignty:

Definition: Data sovereignty is the concept that information which has been converted and stored in digital form is subject to the laws of the country in which it is located.

– Source TechTarget


If you are at all concerned about data security and privacy, it’s not just legal jurisdictions that you need to be worried about. Consider some of the more high profile security breaches over the past few weeks (let alone the past year) in both cloud services and private data centers:

  1. Hundreds of Intimate Celebrity Pictures Leaked Online Following Alleged iCloud Breach – Source Newsweek
  2. Prosecutors: Accused Russian hacker jailed here had 2.1 million stolen credit card numbers when arrested – Source Fox
  3. Data Breach Bulletin: Home Depot Credit Card Breach Could Prove To Be Larger Than Target Breach – Source Forbes
  4. Russian Hackers Amass Over a Billion Internet Passwords – Source New York Times

The message to me is that it doesn’t matter where the data is; it isn’t safe anywhere. In fact, one could argue that while the US DOJ, SEC or IRS having access to your data is a privacy concern, it is less of a threat than a major security breach like the Home Depot one.

So what’s the answer?

Obviously this is a complex problem, and large organizations with lots of smart people have been struggling with it for years. I don’t have a simple answer, nor should you expect one. I know that many of the technology problems we faced in the past have been solved – and even seem quaint. Remember having to rewind VHS movies before DVDs? Or returning DVDs before Netflix? Since I can’t travel to the future to tell you what the solution will eventually be, let’s look to somebody who has seen the future: namely, Captain Jack Harkness.

Captain Jack Harkness

He definitely doesn’t want to get caught with his pants down while saving the Earth. Notice that he is wearing both suspenders (braces for our British readers) and a belt. So what can we learn from this?

While taking all of the precautions that you can with data center processes is an important part of a security strategy, some additional steps can also be taken. Consider data encryption. Yes, the data may still be accessed by unauthorized parties but the data will be of little use to them if they can’t decrypt it. In a private data center that has been compromised, the data may still be safe.

In public cloud environments, data can be encrypted before it enters the vendor’s cloud. The keys can reside in the client’s data center or in a third-party escrow facility. In order for the data to be useful to an attacker, a double breach would be necessary.
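The idea is simple to sketch in code. Here is a toy Python illustration of the key-separation principle: the cloud only ever holds ciphertext, while the key stays on the client side. The cipher below (an HMAC-SHA256 keystream XORed with the data) is purely illustrative – in practice you would use a vetted library and an authenticated mode like AES-GCM, not hand-rolled crypto.

```python
import hashlib
import hmac
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with an HMAC-SHA256 keystream.
    Illustrative only -- use a vetted library (e.g. AES-GCM) in production."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        # Derive the next 32-byte keystream block from key, nonce and counter.
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        stream.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The key never leaves the client's side; only ciphertext goes to the cloud.
key = secrets.token_bytes(32)      # stays in the client's data center (or escrow)
nonce = secrets.token_bytes(16)    # random per message, stored with the ciphertext

plaintext = b"customer record: account 12345"
ciphertext = keystream_xor(key, nonce, plaintext)  # this is all the cloud sees

# Without the key, the cloud provider (or an intruder, or a subpoena)
# holds only unreadable bytes. XORing with the same keystream decrypts.
recovered = keystream_xor(key, nonce, ciphertext)
assert recovered == plaintext
```

Because the same function both encrypts and decrypts, compromising the cloud copy alone yields nothing useful – an attacker would also need to breach wherever the key is held, which is exactly the double-breach requirement described above.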

The same holds true for data sovereignty. Who cares if the DOJ has your data if they can’t read it?

Of course, all of this assumes that the level of encryption being used is sufficiently strong that it is non-trivial to decrypt through brute force or other means.

What do you think the future holds for data sovereignty and security?

New in Azure

“New in Azure” is a phrase I seem to be repeating a lot lately. Azure is constantly changing, evolving and getting better. The name has even changed, from Windows Azure to Microsoft Azure. IaaS has been added. I recall last March I was doing a presentation to a medium-sized audience. I had rehearsed my presentation the night before, and during the presentation an attendee asked me about running Oracle in Azure. I had heard that Microsoft and Oracle were partnering to try to make things easier for customers, but I thought the questioner in the audience was pulling my leg. Really? Oracle on Microsoft Azure? When I logged in and showed the gallery, there they were: a series of Oracle instances ready to provision. They weren’t there the night before. So I used it as an opportunity to do two things:

  1. I told the audience that even one of Microsoft’s biggest competitors in the enterprise space has recognized the value of Azure and chose to be part of something that is growing rapidly.
  2. I told them that this is yet another example of how quickly things can evolve in the cloud and more good things were on tap soon.

I’m thankful that I was able to think quickly on my feet. Of course it was all true, and even more so now. There are new things arriving in Azure all the time. While I was at TechEd in Houston last month, a series of new items was announced in the keynote. I can’t cover them all, and frankly I’m not knowledgeable enough about them all to offer much insight. What I will do, however, is let you know about two specific items that I’m excited about and the use cases that I see for them. If you want a complete list of the items announced, you can find them in Scott Guthrie’s blog.

Azure Remote App

The feature that I’m most excited about is Azure Remote App. Azure Remote App is very similar to Windows Remote App: it allows you to run an application on a server and access it through a thin client. From the perspective of the end user, the application appears to run as if it were installed locally, but it is actually running on a server. Azure Remote App offers this functionality in a public cloud-hosted environment, with the option to run it in a hybrid model. The Azure-based instance can still access on-premises resources if you allow it to.

I’m excited about this for several reasons, but mostly because it supports Android, iOS, Mac OS X and, of course, Windows-based clients. I’m working with a lot of organizations that are experimenting with mobility solutions that include tablets and smartphones. This provides them a great opportunity to publish some applications with minimal provisioning requirements. They can pilot the application in Azure and either scale it out in Azure as needed or move it on-premises for production.

You can try it out for free during the preview period. Let me know what you think about it.

Hybrid Connections

Another feature that I’m excited about is called Hybrid Connections. Hybrid Connections allow applications running in Azure to access enterprise datacenter resources and services securely and easily, without having to poke holes in firewalls or use a VPN. It relies on a BizTalk Service (available in the free tier too). Consider the scenario that I described for Remote App – this makes rolling out an application for mobile users that requires access to on-premises resources much easier.

You can learn more about Hybrid Connections using the following links posted in Scott’s blog: