Hybrid on-premise / cloud architectures with the Azure Service Bus relay

In this post I’ll take you through the steps involved in exposing your on-premise database and .NET code as a simple RESTful service using the service bus relay binding. I’ve also built an ASP.NET MVC client to consume the service.

All the source code is available here

https://github.com/aidancasey/CloudBurster

Background

Recently I’ve been delving into the world of hybrid on-premise / cloud architecture. For start-ups and companies building out new products, the decision to build a cloud solution is often a no-brainer. But what about companies that have a significant investment in existing desktop / on-premise solutions? As much as we’d all love to, it’s not always possible to throw everything out and start again from scratch. Enter the service bus relay binding!

I work for a large ISV that has a significant investment in an on-premise line-of-business accounting system. The product has taken close to a decade to develop. Even with an aggressive online strategy it will take years to migrate this system to a true cloud solution. The service bus relay binding gives us the ability to quickly expose existing business logic and data to the cloud with only a small development overhead.

“I feel that Microsoft haven’t done themselves justice selling the service bus relay to the community”

Personally, I feel that Microsoft haven’t done themselves justice selling the service bus relay to the community. It’s a really clever piece of technology and I’m surprised it isn’t being adopted more widely. I put the following solution together to demonstrate to the business how they could surface on-premise reports in the cloud.

Architecture

Architecture diagram: wrap existing business logic / stored procedure calls in a RESTful API exposed as a webHttpRelayBinding endpoint.

1. Creating a service namespace on Windows Azure

a) Log on to the Windows Azure Management Portal.

b) Click Service Bus, then Create, and enter the name of your service bus namespace (e.g. "cloudburst"). For the best performance you should ensure your RESTful client is also deployed to the same location, in this case US-West.

c) Once created, click on Access Key and take note of the default issuer ("owner") and default key ("Rd+I2mw7CaJ4pdJ7faf4yZKzI92PkYKVnE3qAA7QOIc="). You’ll need to enter these into the app.config file in the OnPremise.ServiceHost project to enable the service host to burst out to the service bus (see the snippet below).
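
As a rough sketch, the service host can then pull these values back out of configuration. The appSettings key names below ("issuerName" and "issuerKey") are my own assumptions, so match them to whatever you actually use in the OnPremise.ServiceHost app.config:

    // A rough sketch, assuming appSettings keys named "issuerName" and "issuerKey"
    // in OnPremise.ServiceHost's app.config (requires a reference to System.Configuration).
    string issuerName = ConfigurationManager.AppSettings["issuerName"]; // e.g. "owner"
    string issuerKey = ConfigurationManager.AppSettings["issuerKey"];   // the default key from the portal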

2. On-premise service host

The Windows Azure Service Bus NuGet package pulls all the service bus dependencies into your project. I’ve used the WebServiceHost to expose a RESTful service definition.

    // Build the public relay address: https://cloudburster.servicebus.windows.net/reports
    string serviceNamespace = "cloudburster";
    Uri address = ServiceBusEnvironment.CreateServiceUri("https", serviceNamespace, "reports");

    // Host the existing service behind a RESTful (webHttp) endpoint.
    WebServiceHost host = new WebServiceHost(typeof(ReportingService), address);
    host.AddDefaultEndpoints();
    host.Open();

I’ve configured the relay binding to use the access key from step 1 when establishing the connection to the service bus.
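
If you prefer to wire this up in code rather than in app.config, a minimal sketch looks something like the following. It reuses the issuerName / issuerKey values read from configuration above; treat the shape as an assumption and adjust it to your own setup:

    // TransportClientEndpointBehavior and TokenProvider live in Microsoft.ServiceBus;
    // ServiceEndpoint is in System.ServiceModel.Description.
    var relayCredentials = new TransportClientEndpointBehavior
    {
        TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey)
    };

    // Attach the credentials to every endpoint the WebServiceHost created so it
    // can authenticate when it opens its listener on the service bus.
    foreach (ServiceEndpoint endpoint in host.Description.Endpoints)
    {
        endpoint.Behaviors.Add(relayCredentials);
    }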

3. Exposing a RESTful API

I’m exposing an API to query contact information from a legacy database. The RESTful API takes the following format:

RESOURCE         URL                                                          VERB
All contacts     https://myobconnector.servicebus.windows.net/Contact/        GET
Single contact   https://myobconnector.servicebus.windows.net/Contact/{id}    GET

The WebGet attribute allows me to configure a JSON response type and to overlay a logical RESTful API over the WCF service contract.

    [ServiceContract(Name = "ContactContract", Namespace = "http://samples.microsoft.com/ServiceModel/Relay/")]
    public interface IContactService
    {
        [OperationContract]
        [WebGet(ResponseFormat = WebMessageFormat.Json, UriTemplate = "/{id}")]
        ContactEntity GetContact(string id);

        [OperationContract]
        [WebGet(ResponseFormat = WebMessageFormat.Json, UriTemplate = "/")]
        List<ContactEntity> GetAllContacts();
    }
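
For completeness, here’s a rough sketch of what an implementation might look like. The real ReportingService in the repo wraps existing business logic and stored procedure calls against the on-premise database, so the entity fields and the in-memory data below are purely illustrative:

    // Requires System.Collections.Generic, System.Linq and System.Runtime.Serialization.
    // Data contract returned over the relay; the actual ContactEntity maps onto the
    // legacy database, these two fields are just for illustration.
    [DataContract]
    public class ContactEntity
    {
        [DataMember] public string Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

    // Stand-in implementation of IContactService. The real service calls into the
    // existing stored procedures; an in-memory list keeps this sketch self-contained.
    public class ReportingService : IContactService
    {
        private static readonly List<ContactEntity> Contacts = new List<ContactEntity>
        {
            new ContactEntity { Id = "1", Name = "Ada Lovelace" },
            new ContactEntity { Id = "2", Name = "Grace Hopper" }
        };

        public ContactEntity GetContact(string id)
        {
            return Contacts.FirstOrDefault(c => c.Id == id);
        }

        public List<ContactEntity> GetAllContacts()
        {
            return Contacts;
        }
    }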

4. Beware! There be dragons when running in a secured network!

Once the service host is running, your data is exposed as a RESTful endpoint. For the purposes of this code sample I haven’t secured the client endpoint and I’m using a plain WebHttpBinding. This requires that HTTP ports 80/443 are open for outbound traffic on your network. If you are running in any sort of secured corporate environment you’ll likely run into firewall problems; this link will point you in the right direction. This is one area where the documentation lets you down slightly. If you just read the brochures you’ll be led to believe that the relay binding can cope with NAT devices and internal firewalls, but if your network administrator is doing their job properly you’ll likely need to get some firewall rules put in place.
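
One setting worth knowing about: by default the relay listener tries to open outbound TCP connections (typically ports 9350-9354) before falling back to HTTP. If those ports are blocked, you can force HTTP connectivity in code before opening the host. Treat this as a workaround sketch rather than a substitute for proper firewall rules:

    // Force the relay to tunnel over HTTP/HTTPS (80/443) instead of raw TCP.
    // ConnectivityMode lives in Microsoft.ServiceBus; set this before host.Open().
    ServiceBusEnvironment.SystemConnectivity.Mode = ConnectivityMode.Http;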

5. Consuming the API from a REST client (ASP.NET MVC)

Consuming the RESTful services is pretty straightforward; please refer to Cloud.App for a working solution. Newtonsoft’s free JSON serializer does a pretty reasonable job of hydrating your JSON payloads back into .NET types.

        public ContactEntity Get(int id)
        {
            string url = "https://cloudburst.servicebus.windows.net/contact/" + id;

            using (WebClient serviceRequest = new WebClient())
            {
                // Call the relayed endpoint and hydrate the JSON payload back into a .NET type.
                string response = serviceRequest.DownloadString(new Uri(url));

                var data = JsonConvert.DeserializeObject<ContactEntity>(response);

                return data;
            }
        }
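
A matching call for the all-contacts resource might look like the sketch below (again, the namespace and route are the sample’s; adjust them to your own):

        public List<ContactEntity> GetAll()
        {
            string url = "https://cloudburst.servicebus.windows.net/contact/";

            using (WebClient serviceRequest = new WebClient())
            {
                string response = serviceRequest.DownloadString(new Uri(url));

                // Newtonsoft.Json can hydrate straight into a typed collection.
                return JsonConvert.DeserializeObject<List<ContactEntity>>(response);
            }
        }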

6. Benchmarking, performance & latency

Work in progress – I’m building a simple test harness to benchmark performance and latency when sending different-sized payloads over a relay binding. From running the MVC REST client it looks like establishing the channel can be expensive (approx. 1 second) the first time, but subsequent service calls are pretty responsive. I’ll be publishing some test results soon. The plan is to build a simple ping service and instrument the timings.
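
As a rough sketch of the kind of instrumentation I have in mind (it simply hits the sample’s all-contacts resource; a dedicated ping resource would eventually replace it):

    // Time a handful of calls against the relayed endpoint: the first call pays
    // the channel / connection set-up cost, the rest should be warm.
    var stopwatch = new Stopwatch();
    using (var client = new WebClient())
    {
        for (int i = 0; i < 5; i++)
        {
            stopwatch.Restart();
            client.DownloadString("https://cloudburst.servicebus.windows.net/contact/");
            stopwatch.Stop();
            Console.WriteLine("Call {0}: {1} ms", i + 1, stopwatch.ElapsedMilliseconds);
        }
    }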

Conclusion

I’m sure you’ll agree that it’s a pretty painless process to pick up your existing .NET code and start to expose it using the relay binding. There weren’t many examples out there, so I decided to write this post and open source the code. This hybrid on-premise / cloud architecture has lots of possibilities in the real world.

It offers a pretty compelling alternative for companies that are slow to store their data in the cloud because of data sovereignty concerns (remember, the data is still stored on premise), or for applications that need to surface some of their functionality to the cloud.

May the source be with you!

Aidan

MYOB Neo4J Coding Competition

Last week marked the end of the MYOB Neo4J coding competition. This was an internal competition for the development team in the Accountants Division of MYOB to develop a customer relationship system for accountants using Node.js and Neo4J. MYOB is one of the largest ISVs in Australia, and the team in the Accountants Division is focused on developing line-of-business applications for accounting practices.

A coding competition with a difference!

I wanted to have a level playing field for the competition, so what better to throw at a bunch of Microsoft developers than a Neo4J, Node.js and Heroku challenge! The competition ran for 8 weeks and the challenge was to build an online CRM system that ingested a bunch of text files representing data from a typical accounting practice. The business domain was very familiar to the team but the technologies were all new.

To add another twist, points were awarded to the people within the team that made the biggest community contributions over the 8 weeks (MYOB ‘brown bag’ webinar sessions, yammer discussion threads and gists on GitHub). I wanted this to be a very open open-source competition!

Why Neo4J?

When you dig deeper and analyse the data that an accounting practice uses, it’s all based around relationships – an accounting practice has employees, employees manage a bunch of clients, and these clients are often related (husband and wife, family trust, etc.). The competition gave the team a chance to dip their toes into the world of graph databases and to see how naturally we could work with the data structures.

And the winner is Safwan Kamarrudin!

I’m pleased to announce that Safwan Kamarrudin is the winner and proud owner of a new iPad! Safwan’s solution entitled “Smörgåsbord” pulled together some really cool node.js modules including the awesome node-neo4j, socket.io and async. Safwan made a massive contribution to the competition community through the use of yammer posts, publishing GitHub Gists and by running brownbag sessions here in the office.

Accountants Division program manager Graham Edmeads presenting Safwan with his prize!

An interview with the winner!

Qn – So where did you come up with the name “Smörgåsbord”, are you a big fan of cold meat and smelly cheese?

I chose the name because the competition asked contestants to use a smorgasbord of technologies. Plus, I thought it would be cool to have umlauts in the name.

 Qn – Where can we find your solution on GitHub?

https://github.com/safwank/Smorgasbord

 Qn – Complete this sentence – Neo4J is completely awesome because ….

Data is becoming more inter-connected and social nowadays. While “relational” databases can be used to build such systems, they are definitely not the right tool for the job due to their one-size-fits-all nature (despite the name, relational databases are anything but relational). Modelling inter-connected data requires a database that is by nature relational and schema-free, not to mention scalable! And in the land of graph databases, in my opinion there is no database technology that even comes close to Neo4J in terms of its features, community and development model.

 Qn – What in your opinion is the biggest challenge to wrapping your head around Graph database concepts?

For someone who is more used to relational databases, the differences between nodes and tables take some getting used to. In a graph database, all nodes are essentially different and independent of each other. They just happen to belong to some indices or are related to other nodes.

This also relates to the fact that nodes of a similar type may not have a fixed schema, which can be good or bad depending on how you look at it.

Another subject that I had to grapple with was whether it makes sense to denormalize data in Neo4J. In a NoSQL database, normalization has no meaning per se. In some cases, data normalization even negates the benefits of NoSQL. Specifically, many NoSQL databases don’t have the concept of joins, so normalizing data entails having to make multiple round trips to the database from the application tier or resorting to some sort of map-reduce routine, which is inefficient and over engineered. Moreover, normalization assumes that there’s a common schema shared between different types of entities, and having a fixed schema is antithetical to NoSQL.

 Finally a word of thanks!

I’d like to say a huge thanks to Jim Webber, Chief Scientist at Neo Technologies for helping me launch the coding competition. Jim was struck down with chicken pox just hours before the competition was launched but he still managed to join me online to launch it and take the team through the infamous Dr Who use case. You are a legend Jim, many thanks!

May the source be with you!

Aidan

How an architect can build an exceptional software development team

I’ve had the pleasure of hiring and growing an awesome team of developers at MYOB Australia. In this post, I share my ideas for how an architect can build an exceptional development team.

Hire craftsmen not programmers

Craftsmen take pride in their code down to the very last detail. They watch over the code and fix up the broken windows as they come across them. When you get the opportunity to hire new people, don’t waste it with mediocrity. Dig deep in the interviews, understand how the candidate solves problems and, if possible, watch them code. If you have any doubts then keep on looking.

Hire great communicators

Great communication makes a team succeed. Make sure you hire candidates that prefer open, honest conversations over lengthy email trails. You want developers that are comfortable at the whiteboard and can explain their ideas clearly. Encourage everyone on the team to have a voice and to respect each other’s opinions. As the architect, your role is to communicate the designs to everyone, over and over again.

No big upfront architecture and design

Big upfront architecture fails – end of story. As the architect you need to set the vision and technical direction for the team, but you must allow the designs to evolve and fall out naturally as your guys code out the features. Empower the entire team to make architectural decisions and guide them along the path.

Create a culture of continuous learning

Lunch time brown bag sessions are a fun and social way for your team to learn. Encourage everyone to present a session, don’t stick to the same presenters. We’re in the middle of a “20/20 brown bag series” at MYOB – 20 brown bag sessions in 20 weeks covering a wide range of topics, not just programming.

Try an internal social networking tool

Yammer is a great tool for helping like-minded people connect and share ideas. Use some gentle persuasion to get everyone yammering – once people get it, you can step back and watch the ideas flow.

Organise a coding competition

A coding competition is another fun way to get your team to think outside the square. We are pretty much an all-Microsoft shop, so I threw down the gauntlet and organised a challenge where the guys had to learn a whole bunch of new open source tools – Neo4J, Node.js and Heroku.

Be approachable – all the time

No matter how busy your day is, if you are at your desk and someone approaches you for help make time for them.

Hope this helps

Aidan