Cloud Computing

Cloudy in Canada


Today Microsoft announced two new datacentres in Canada:

“I’m very pleased to share the announcement of a major investment by Microsoft in locally delivered cloud services across two new Azure regions in the provinces of Ontario and Quebec coming in CY16. These facilities are home to the hardware that powers the Microsoft cloud and its capabilities.

Local delivery of cloud services supports our range of offerings including Microsoft Azure, our hyperscale, enterprise-grade, and hybrid public cloud offering; Office 365, which puts Microsoft Office at your fingertips virtually anywhere, across all of your favorite devices; and Microsoft Dynamics, our leading Customer Relationship Management system.”

This announcement is very welcome, as many organizations in Canada are cautious about using cloud services that do not guarantee data residency. There is a lot of confusion and concern around this issue. Much of the trepidation is probably not warranted, but organizations, their auditors, and their lawyers are being overcautious and watching what happens in the ongoing battle with the DOJ, in which Microsoft is refusing to provide Office 365 emails stored in an Irish datacentre.

If Microsoft is eventually compelled to deliver the emails in question, will it matter to Canadian organizations that there are datacentres in Quebec City and Toronto?

Data Sovereignty – A Lesson from the Future


  1. Data stored overseas should be accessible to US government, judge rules– Source Reuters
  2. Obama administration contends that company with operations in US must comply with warrants for data, even if stored abroad– Source The Guardian

With the rulings this summer that Microsoft must provide the US government with customer data even if it is stored outside of the United States, many organizations and individuals alike are concerned about data sovereignty and privacy. And they should be. However, legal issues like data sovereignty and Safe Harbor are distractions from the real issue.

Let’s start with a definition of Data Sovereignty:

Definition: Data sovereignty is the concept that information which has been converted and stored in digital form is subject to the laws of the country in which it is located.

– Source TechTarget


If you are at all concerned about data security and privacy, it’s not just legal jurisdictions that you need to be worried about. Consider some of the more high profile security breaches over the past few weeks (let alone the past year) in both cloud services and private data centers:

  1. Hundreds of Intimate Celebrity Pictures Leaked Online Following Alleged iCloud Breach– Source Newsweek
  2. Prosecutors: Accused Russian hacker jailed here had 2.1 million stolen credit card numbers when arrested– Source – Fox
  3. Data Breach Bulletin: Home Depot Credit Card Breach Could Prove To Be Larger Than Target Breach– Source Forbes
  4. Russian Hackers Amass Over a Billion Internet Passwords – Source New York Times

The message to me is that it doesn’t matter where the data is: it isn’t safe. In fact, one could argue that while the US DOJ, SEC, or IRS having access to your data is a privacy concern, it is less of a threat than a major security breach like the one at Home Depot.

So what’s the answer?

Obviously this is a complex problem, and large organizations with lots of smart people have been struggling with it for years. I don’t have a simple answer, nor should you expect one. I know that many of the technology problems we faced in the past have been solved – and even seem quaint. Remember having to rewind VHS movies before DVDs? Or returning DVDs before Netflix? Since I can’t travel to the future to tell you what the solution will eventually be, let’s look to somebody who has seen the future. Namely, Captain Jack Harkness.

Captain Jack Harkness

He definitely doesn’t want to get caught with his pants down while saving the earth. Notice that he is wearing both suspenders (braces for our British readers) and a belt? So what can we learn from this?

While taking all of the precautions that you can with data center processes is an important part of a security strategy, some additional steps can also be taken. Consider data encryption. Yes, the data may still be accessed by unauthorized parties but the data will be of little use to them if they can’t decrypt it. In a private data center that has been compromised, the data may still be safe.

In public cloud environments, data can be encrypted before it enters the vendor’s cloud. The keys can reside in the client’s data center or in a third-party escrow facility. For the stolen data to be of any use, a double breach would be necessary: one to get the ciphertext and another to get the keys.
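
As a minimal sketch of the pattern, here is what encrypt-before-upload can look like in Python using the cryptography package. The upload_to_cloud function is a hypothetical placeholder for whatever storage API your vendor provides; the point is that the key is generated and kept on your side, so the provider only ever sees ciphertext.

```python
from cryptography.fernet import Fernet

def upload_to_cloud(path: str, blob: bytes) -> None:
    """Hypothetical stand-in for the cloud vendor's storage API."""
    print(f"uploading {len(blob)} bytes to {path}")

# Generate the key on premises (or place it with a third-party escrow
# service). It is never handed to the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

document = b"Quarterly financials - internal distribution only"

# Encrypt locally, then upload only the ciphertext.
upload_to_cloud("reports/q3.bin", cipher.encrypt(document))

# An attacker (or a subpoena) that reaches the vendor gets ciphertext;
# reading it requires a second breach to steal the key as well.
```

Fernet here is just a convenient authenticated-encryption recipe (AES with an HMAC); the same pattern holds for whatever cipher and key-management regime your auditors require.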

The same holds true for data sovereignty. Who cares if the DOJ has your data if they can’t read it?

Of course, all of this assumes that the encryption being used is sufficiently strong that decrypting it through brute force or other means is non-trivial.

What do you think the future holds for data sovereignty and security?

New in Azure


“New in Azure” is a phrase I seem to be repeating a lot lately. Azure is constantly changing, evolving, and getting better. The name has even changed, from Windows Azure to Microsoft Azure. IaaS has been added. I recall that last March I was doing a presentation to a medium-sized audience. I had rehearsed my presentation the night before, and during the presentation an attendee asked me about running Oracle in Azure. I had heard that Microsoft and Oracle were partnering to try and make things easier for customers, but I thought the questioner in the audience was pulling my leg. Really? Oracle on Microsoft Azure? When I logged in and showed the gallery, there they were: a series of Oracle instances ready to provision. They weren’t there the night before. So I used it as an opportunity to do two things:

  1. I told the audience that even one of Microsoft’s biggest competitors in the enterprise space has recognized the value of Azure and chose to be part of something that is growing rapidly.
  2. I told them that this is yet another example of how quickly things can evolve in the cloud and more good things were on tap soon.

I’m thankful that I was able to think quickly on my feet. Of course, it was all true, and even more so now. There are new things arriving in Azure all the time. While I was at TechEd in Houston last month, a series of new items was announced in the keynote. I can’t cover them all, and frankly I’m not knowledgeable enough about them all to offer much insight. What I will do, however, is let you know about two specific items that I’m excited about and the use cases that I see for them. If you want a complete list of the items announced, you can find them in Scott Guthrie’s blog.

Azure RemoteApp

The feature that I’m most excited about is Azure RemoteApp. Azure RemoteApp is very similar to Windows RemoteApp: it allows you to run an application on a server and access it through a thin client. From the perspective of the end user, the application appears to run as if it were installed locally, but it is actually running on a server. Azure RemoteApp offers this functionality in a public cloud hosted environment, with the option to run it in a hybrid model. The Azure-based instance can still access on-premises resources if you allow it to.

I’m excited about this for several reasons, but mostly because it supports Android, iOS, Mac OS X, and of course Windows-based clients. I’m working with a lot of organizations that are experimenting with mobility solutions that include tablets and smartphones. This gives them a great opportunity to publish applications with minimal provisioning requirements. They can pilot an application in Azure and either scale it out in Azure as needed or move it on premises for production.

You can try it out for free during the preview period. Let me know what you think about it.

Hybrid Connections

Another feature that I’m excited about is called Hybrid Connections. Hybrid Connections allow applications running in Azure to access enterprise datacenter resources and services securely and easily, without having to poke holes in firewalls or use a VPN. The feature relies on a BizTalk Service (available in the free tier too). Consider the scenario that I described for RemoteApp: this makes rolling out an application for mobile users that requires access to on-premises resources much easier.
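
To make the “no VPN, no firewall holes” point concrete, here is a rough Python sketch of what application code looks like once a Hybrid Connection maps an on-premises endpoint into the Azure-hosted app’s environment. The server name, database, and credentials are hypothetical; the takeaway is that the code is identical to what it would be on the corporate LAN.

```python
import pyodbc  # assumes a SQL Server ODBC driver is installed

# "onprem-sql01" is a hypothetical on-premises server that a Hybrid
# Connection exposes to this Azure-hosted app. No VPN client, no
# inbound firewall rules; the relay handles the plumbing.
conn = pyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=onprem-sql01,1433;"
    "DATABASE=Inventory;"          # hypothetical database
    "UID=app_user;PWD=example-pw"  # hypothetical credentials
)

for row in conn.execute("SELECT TOP 5 name FROM sys.tables"):
    print(row.name)
```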

You can learn more about Hybrid Connections in Scott’s blog.


Office 365 for Fringe Use Case Scenarios


Microsoft’s newest billion-dollar business units include Office 365 and Azure. There’s lots of marketing, sales, and ROI information about Office 365 and cloud services in general, so I’m not going to bore you with another post about how to save your organization money or accelerate value by adopting Office 365. Instead, I’m going to describe two real-world use cases where I have personally found Office 365 to help. I might even throw in some anecdotal cost-benefit analysis, but my main purpose is to explore some less common uses for Office 365 that you may not have thought of.

The two scenarios are:

  1. External consultants
  2. Test and Development

External Consultants

I manage a team of consultants who regularly have to work at client sites, often at very security-conscious organizations. We can’t always use our own laptops in their environments, and when we can, it is typically through guest wireless networks. We’ve encountered situations where the guest wireless prevents us from connecting back to our office through VPN, which makes it difficult to access some of our collaboration services like SharePoint. We moved my team to Office 365 specifically to do things like coauthoring documents in SharePoint from customer sites. This enables some interesting scenarios: we’ve had cases where an offsite consultant was able to review and update documentation while it was being simultaneously authored by another consultant working in our lab.

Test and Development

We do a lot of System Center work. System Center is a complex suite of products that interact with each other as well as with core Windows infrastructure like Active Directory and Exchange. When we are building out a proof of concept for a customer, they typically don’t want us to touch their production AD and Exchange environments. I don’t blame them. Ultimately, in order to complete the project, we would need to somehow build out an Active Directory and Exchange infrastructure dedicated to the proof of concept or pilot. Consider the additional costs in hardware, software, and time required to accomplish this. Lately we’ve started using Office 365 to provide the Exchange services instead. It takes minutes to provision and connect to. Examples we’ve used recently include the Exchange connector for Configuration Manager and Service Manager. Using this approach, in under an hour I was able to get more than a half dozen mobile devices loaded into Configuration Manager for an MDM/UDM proof of concept, without touching any production AD or Exchange infrastructure, simply by adding an additional email account to the devices.

We’ve extended this to Azure as well. We have been using Azure to host System Center instances for proof-of-concept and sandbox deployments. I’m looking forward to combining Azure with Office 365 to further accelerate our pilots and proof-of-concept deployments.

Getting Started with Cloud Computing


You’ve likely heard about how Office 365 and Windows Intune are great applications to get you started with Cloud Computing. Many of you emailed me asking for more info on what Cloud Computing is, including the distinction between “Public Cloud” and “Private Cloud”. I want to address these questions and help you get started. Let’s begin with a brief set of definitions and some places to find more info; however, an excellent place where you can always learn more about Cloud Computing is the Microsoft Virtual Academy.

Public Cloud computing means that the infrastructure to run and manage the applications users are taking advantage of is run by someone else and not you. In other words, you do not buy the hardware or software to run your email or other services being used in your organization – that is done by someone else. Users simply connect to these services from their computers and you pay a monthly subscription fee for each user that is taking advantage of the service. Examples of Public Cloud services include Office 365, Windows Intune, Microsoft Dynamics CRM Online, Hotmail, and others.

Private Cloud computing generally means that the hardware and software to run services used by your organization is run on your premises, with the ability for business groups to self-provision the services they need based on rules established by the IT department. Private Cloud implementations today are generally found in larger organizations, but they are also viable for small and medium-sized businesses, since when properly implemented they allow automation of services and a reduction in IT workloads. Having the right management tools, like System Center 2012, to implement and operate a Private Cloud is important in order to be successful.

So – how do you get started? The first step is to determine what makes the most sense to your organization. The nice thing is that you do not need to pick Public or Private Cloud – you can use elements of both where it makes sense for your business – the choice is yours. When you are ready to try and purchase Public Cloud technologies, the Microsoft Volume Licensing web site is a good place to find links to each of the online services. In particular, if you are interested in a trial for each service, you can visit the following pages: Office 365, CRM Online, Windows Intune, and Windows Azure.

For Private Cloud technologies, start with some of the courses on Microsoft Virtual Academy, then download and install the Microsoft Private Cloud technologies, including Windows Server 2008 R2 Hyper-V and System Center 2012, in your own environment and take them for a spin. Also, keep up to date with the Canadian IT Pro blog to learn about events Microsoft is delivering, such as the IT Virtualization Boot Camps, to get you started with these technologies hands-on.

Finally, I want to ask for your help to allow the team at Microsoft to continue to provide you with what you need. Twice a year, through something we call “The Global Relationship Study”, they reach out to see how they’re doing and what Microsoft could do better. If you get an email from “Microsoft Feedback” with the subject line “Help Microsoft Focus on Customers and Partners” between March 5th and April 13th, please take a little time to tell them what you think.


The Difference between Cloud Computing and Virtualization


Lately I’m getting asked to explain the difference between virtualization and cloud computing.  I like answering this question because it shows that the person asking the question has at least enough knowledge to identify that there are similarities but that they are probably not the same.  Explaining the difference to this type of questioner is not usually a problem.

What bothers me a little more is when so-called IT professionals use the terms VM and cloud interchangeably and then claim that they are pretty much the same thing, or that if something works in one it should work in the other.  It is easy to get into a debate and find specific examples to bolster most claims on either side.  Reality is not quite so simple.  The right answer will usually start with “it depends.”

With the rest of this post I’ll try to explain some of what it depends on and why there aren’t any simple answers. I’ll also give some examples of the beginnings of some more complicated answers without getting too technical.

The question worded a little differently:  Aren’t Virtualization and Cloud Computing the same thing?

Before we begin, let’s get a couple of things straight:

  1. Remember that Cloud Computing is a delivery model and that Virtualization is a technology.  Virtualization as a technology may be used at the back end of a service that is delivered with a Cloud Computing service offering but not necessarily.
  2. Virtualization is only one of the building blocks for cloud computing, and there are many types of virtualization (server, desktop, application, storage, network, etc.), so categorical statements about virtualization and cloud computing are risky.  It really depends on what is being virtualized and how it is made available by the cloud provider.
  3. There are different cloud styles (fabric based, instance based, etc.), service models (SaaS, PaaS, IaaS), and deployment models (private, public, hybrid, community).  Thus, an answer with any significant depth that is correct when describing a fabric-based community PaaS will most likely be incorrect when applied to an instance-based private IaaS.

At the risk of oversimplifying, let’s just consider a simple VM running on a bare-metal, hypervisor-based virtualization platform.  Although the hypervisor abstracts the hardware and makes it available to the VM, the VM is still bounded by the physical server itself.  What I mean by this is that although you may be able to move a live VM from one physical server to another, the entire VM (its memory and processor resources) must reside on one physical server, and a single virtual LUN is required for storage.

Something very similar is the instance-based cloud (in fact, Amazon’s EC2 uses Xen-based VMs at its core).  This one-to-many relationship between physical resources and user containers (call them VMs if you like, but technically they should be referred to as instances) obviously puts limits on the linear scalability and redundancy of this cloud approach.  For many, this scalability limitation is offset by the ease of porting an application to an instance-based cloud.
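
To give a feel for the instance-based model, here is a short sketch using boto3 (the current AWS SDK for Python); the AMI ID is a placeholder.  You ask EC2 for whole, fixed-size instances, and scaling out means asking for more of them.

```python
import boto3  # AWS SDK for Python

ec2 = boto3.client("ec2", region_name="us-east-1")

# Each instance is a VM-shaped container whose size is picked up front;
# like a VM, its CPU and memory live on a single physical host at a time.
response = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=3,               # scaling out = more instances, not bigger ones
)

for instance in response["Instances"]:
    print(instance["InstanceId"], instance["State"]["Name"])
```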

Fabric-based clouds achieve higher scalability through the use of a fabric controller that keeps track of all of the computing resources (memory, processor, storage, network, etc.) and allocates them as services to applications.  The physical resources can be distributed among many physical systems.  Again, at the risk of oversimplifying, the fabric controller is like an operating system kernel, and the fabric itself acts much like a traditional OS in its relationship to a specific application.  Fabric-based clouds have a many-to-many relationship that allows many applications to use resources on many physical systems.  This model results in superior scalability and, theoretically, less downtime.  However, this comes at the cost of application compatibility, as applications must be designed to run in a fabric.
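
The kernel analogy is easier to see in code.  Below is a deliberately toy “fabric controller” in Python: it tracks the free capacity of every node and satisfies one application’s request by carving slices out of several physical machines at once, which is exactly what a single VM cannot do.  Real fabric controllers add placement constraints, health monitoring, and failover, but the many-to-many bookkeeping is the heart of it.

```python
# Toy fabric controller: one app's request is spread across MANY
# physical nodes, unlike a VM, which must fit entirely on one.

nodes = {"node-a": 8, "node-b": 8, "node-c": 8}  # free cores per node

def allocate(app: str, cores_needed: int) -> dict:
    """Greedily carve the request out of whichever nodes have capacity."""
    grant = {}
    for node, free in nodes.items():
        if cores_needed == 0:
            break
        take = min(free, cores_needed)
        if take:
            nodes[node] -= take
            grant[node] = take
            cores_needed -= take
    if cores_needed:
        raise RuntimeError(f"fabric out of capacity for {app}")
    return grant

# A 20-core application spans three physical machines: many-to-many.
print(allocate("analytics-app", 20))
# -> {'node-a': 8, 'node-b': 8, 'node-c': 4}
```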

So yes, in some instances (pun intended) cloud computing is just large-scale server virtualization, but cloud computing is not necessarily the same as virtualization, and there are many examples of cloud computing that are significantly different from traditional virtualization.


I See a Cloud Router in your Future


Yesterday I met with David Ker, one of the founders of RealWat Inc.  They currently offer the Ti-Took Nuage browser, which is based on Google’s Chromium (the Chrome open source project).  The Nuage browser seeks to improve the browsing experience by adding improved privacy, security, speed, and other Web 2.0 cloud-based services such as social bookmarking (more feature details here).  While the Ti-Took Nuage browser is interesting, I’m unsure of its long-term mass appeal as other players (including the big browser shops) add similar functionality to their offerings, but for now Ti-Took is blazing a new trail.

Download a copy of the Nuage Browser here.

What got me more excited is a new project that they are working on called the Ti-Took Cloud Router.  It’s an innovative offering that essentially front-ends an IaaS offering such as Amazon’s EC2.  The Ti-Took Cloud Router is targeted at small organizations that want to take advantage of cloud service offerings but still require security and scope of control.  Using the Cloud Router essentially creates a virtual private cloud (vPC?) inside a public cloud that encapsulates the services that are important for an individual organization’s business and users.  It also allows secure access to a virtual datacenter from public locations.  The key to all of this is their web-based identity management service, which provides unified single sign-on that securely validates users into the vPC and then controls access to other cloud services like email or CRM.  We discussed the importance of extended authentication protocols, and they assure me that they are investigating two-factor authentication.

I foresee this type of offering accelerating the adoption of cloud services in the SMB space.  I’m looking forward to more announcements from RealWat.

The Public Private Disconnect


I’ve been looking at public cloud offerings informally for a while, and while I appreciate the case for a full cloud infrastructure, it will be quite some time before large enterprise datacenters can realistically retool everything for the cloud.  Sure, small startups can very successfully have large parts, if not all, of their IT services in the cloud, but there are still too many barriers for larger organizations with large investments in “in-house” IT resources.

My initial thought was that there could be an opportunity for hybrid clouds that provide organizations with excess capacity on demand.  This would be a great way to augment datacenters with cloud-bursting capability and introduce organizations to cloud offerings with low risk.  Of course, to accomplish this, enterprises will need to build infrastructures that are compatible with public clouds.

The problem is that none of the (admittedly small) sample of private cloud infrastructures that I’ve looked at uses the same APIs as its public cloud counterparts.  Until now, that is …

Here’s an interesting announcement from Eucalyptus and Terracotta that essentially provides the management tools and a private cloud infrastructure using the same APIs as Amazon AWS.  This is a good start, and I hope and expect to see more offerings that make it easy to build hybrid cloud solutions.

Read more here:  Terracotta and Eucalyptus Partner to Deliver High-Performance Data Scaling in Private Cloud Environments | Business Wire.

I expect that MS will soon announce on-premises availability of the Azure platform.

10 Cloud Computing Predictions For 2010


10 Cloud Computing Predictions For 2010.

Wow – way to go out on a limb.  Other than Sam Johnston (prediction 9 on slide 10), everyone is essentially predicting the growth of cloud offerings and adoption rates.

My prediction for 2010: there will be a significant cloud security breach (or breaches) that slows down adoption rates for a time but ultimately highlights the missing pieces of most cloud service offerings.  Two of these missing pieces, service level agreements and security services, will become more important in closing new cloud business.

I know I haven’t gone much further out on the limb than the others, but at least I have a chance of being wrong.

Let me know how accurate I was in 2011.