Build a Hybrid Application with the Ionic Framework and Microsoft Azure Mobile Services

Introduction

In this post I’ll show you how to create a hybrid mobile application using the open source Ionic framework and how to integrate the application securely with a Microsoft Azure Mobile Services backend. By integrating with Azure Mobile Services you get to connect to reliable cloud storage via a simple JavaScript SDK.

I’ve made the complete source code available on GitHub – feel free to fork the code or use it however you like.

Ionic Framework

Ionic is an open source JavaScript and CSS framework for building cross-platform HTML5 mobile applications. It’s built on Cordova and AngularJS and it comes with a nifty command line interface that lets you build iOS and Android apps. My experience working with Ionic has been really good. It is very easy to get started with and there is a strong community behind it, with about 200 contributors to the codebase. Work is well underway to remove the dependency on AngularJS, which means you’ll be able to plug the Ionic framework into your JavaScript framework of choice in the near future.

Microsoft Azure Mobile Services

Microsoft Azure Mobile Services is a mobile backend as a service (MBaaS) offering from Microsoft. Almost every mobile application needs to store some data and deal with push notifications, service monitoring and authentication. With Microsoft Azure Mobile Services you get all of this as a “platform as a service” offering. You don’t need to spin up a single server to go live with a mobile app – simply provision yourself a mobile services backend in Azure and you have access to all of these features along with a fully programmable Node.js or .NET backend.

The todo app

I’ll take you through the key steps and code needed to build a simple todo mobile app. Users will authenticate with a Google+ account. Once logged in they’ll be able to create new tasks and maintain a list of outstanding tasks.


Creating the Ionic app

First up, install Apache Cordova and the Ionic framework via npm:

$ npm install -g cordova
$ npm install -g ionic

Next, create a new Ionic application using the sidemenu starter template:

$ ionic start Azure-Ionic-App sidemenu

That’s it! If you run the command “ionic serve” a browser will open running your application. The application we are building consists of three simple views and controllers – login, add task, and view all tasks.
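To give you a feel for the structure, here’s a minimal sketch of how the three states could be wired up in app.js – the state names, template paths and controller names below are my own assumptions based on the sidemenu starter, not a verbatim copy of the generated project.

// app.js – sketch of the three application states (names and paths are assumptions)
angular.module('app', ['ionic', 'azure', 'app.controllers', 'app.services'])
  .config(function ($stateProvider, $urlRouterProvider) {
    $stateProvider
      // the login screen users see before authenticating
      .state('login', {
        url: '/login',
        templateUrl: 'templates/login.html',
        controller: 'LoginCtrl'
      })
      // abstract parent state that hosts the side menu
      .state('app', {
        url: '/app',
        abstract: true,
        templateUrl: 'templates/menu.html'
      })
      // view all outstanding tasks
      .state('app.list', {
        url: '/tasks',
        views: {
          'menuContent': {
            templateUrl: 'templates/task-list.html',
            controller: 'TaskListCtrl'
          }
        }
      })
      // add a new task
      .state('app.add', {
        url: '/tasks/add',
        views: {
          'menuContent': {
            templateUrl: 'templates/add-task.html',
            controller: 'AddTaskCtrl'
          }
        }
      });

    // start on the login screen
    $urlRouterProvider.otherwise('/login');
  });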

Implementing Google Authentication

Firstly, create a new Azure Mobile Service via the Azure portal. If you don’t have an account you can sign up for a free trial here. You’ll get more than enough credits with a trial account to develop and test an application.

Once your mobile service is provisioned, take a note of the mobile service URL on the dashboard page, e.g. https://ionic-todo.azure-mobile.net – this is the unique API endpoint for your mobile app.


The process of registering your app for a Google login is already well documented here. You’ll need to log in to the Google developer console, create a new web project and register for “Google+ API” OAuth authentication, then in the credentials tab enter the authorization redirect URL for your application. This will be “your-mobile-services-url/login/google” – in my case it’s “https://ionic-todo.azure-mobile.net/login/google”.

Now to associate your Google application with the mobile service backend. Go back to the Azure portal and enter the Google client ID and secret on the identity tab.


To integrate a Google login into your Ionic application you’ll need to bundle the Mobile Services JavaScript client library with your application and add the following AngularJS factory, which returns a mobile service client object.

angular.module('azure', [])
  .factory('client', [function () {
    var client = new WindowsAzure.MobileServiceClient(
      "https://ionic-todo.azure-mobile.net/",
      "your-mobile-services-application-key"
    );
    return client;
  }]);

Using the client object we can take care of the OAuth flow by simply calling client.login("google");


//the login controller calls client.login("google") to perform the OAuth dance
angular.module('app.controllers', ['azure'])
  .controller('LoginCtrl', function (client, $scope, $state) {
    $scope.login = function () {
      client.login("google").then(function success(data) {
        console.log('logged in successfully..');
        $state.go('app.list');
      }, function (error) {
        //login failed.
      });
    };
  });

The OAuth flow happens in a separate browser window. You must install the Cordova InAppBrowser plugin to be able to show the login popup on a mobile device. Run the following command to install the plugin locally:

$ ionic plugin add cordova-plugin-inappbrowser

Then add the plugin to your config.xml file to bundle it as part of the Ionic build process.

  <feature name="InAppBrowser">
    <param name="ios-package" value="CDVInAppBrowser"/>
    <param name="android-package" value="org.apache.cordova.inappbrowser.InAppBrowser"/>
  </feature>

Once logged in, a user context is set on the client object. You can access it at any time by injecting the client factory we created earlier into the relevant controller and reading client.currentUser.userId.
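For example, a quick hedged snippet – assuming the client factory and $state are injected into the controller, you can read the user id and bounce unauthenticated users back to the login screen (the 'login' state name is my assumption):

// assumes 'client' and '$state' have been injected into the controller
if (client.currentUser) {
  console.log('logged in as ' + client.currentUser.userId);
} else {
  $state.go('login'); // no user context – send them back to the login screen
}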

We’ll store the unique userId property as part of each todo item in the data store to make the solution multi-tenanted and to ensure you only get to see your own todo list tasks – I for one don’t want to complete other people’s tasks!

Storing Data in Azure Mobile Services

When you provisioned a mobile service on Azure you also provisioned a SQL Server backend, but chances are you won’t treat the backend like a relational database. When you provision a new table in the database it gets allocated a dynamic schema by default (very cool, but you can turn this off if you like). As you make API calls to store data in a table, new columns are generated based on the properties of the JSON object you send to the API.

Go to the data tab in mobile services and create a table called “Task” to store the todo items. Set the table permissions for insert, update, delete and read operations to “authenticated users”. You can now read and write data to this table via the mobile services client factory class we created earlier.
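As a quick illustration of the dynamic schema (a sketch only – the extra fields here are made up for this example), inserting an object with new properties is enough to add matching columns to the Task table:

// the 'complete' and 'dueDate' columns are created on the fly by the dynamic schema
client.getTable('Task').insert({
  text: 'buy milk',
  complete: false,
  dueDate: '2015-06-01'
}).then(function (inserted) {
  console.log('saved task with id ' + inserted.id);
});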

It’s worth highlighting that the backend API that writes to the database table is fully programmable and you can write Node.js scriptlets to validate and transform your data before it gets written to the data store. If you’d prefer to write your data to a MongoDB or DocumentDB data store it’s pretty easy to swap out the database completely – more information here.
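For illustration only – we won’t actually use one in this post – an insert scriptlet on the Node.js backend could look something like the sketch below; the validation rule and the createdAt stamp are just examples I’ve made up:

// insert script for the Task table (illustrative sketch only)
function insert(item, user, request) {
  // reject empty tasks before they hit the database
  if (!item.text || item.text.length === 0) {
    request.respond(statusCodes.BAD_REQUEST, 'A task must have some text.');
    return;
  }
  // stamp the row before it is written
  item.createdAt = new Date();
  request.execute();
}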

For the purposes of this post we won’t create any server side scriptlets – we’ll let the data go straight through to the Task table. To keep the Ionic application more modular I created a separate AngularJS factory class to connect to the Azure backend.

There are a few things worth highlighting in the code:

  • when saving data in the addTask() function I tack on a userId property.
  • when reading data in the getAll() function I create a filter expression to return only the tasks for the logged-in user.

angular.module('app.services', ['azure'])
  .factory('azureAPI', ['client' ,'$q', '$rootScope', function (client, $q, $rootScope) {

    return {
      getAll: function () {
        var deferred = $q.defer();

        var userId = client.currentUser.userId;

        //filter by user id
        client.getTable('Task').where({userId: userId}).read().then(function () {
          deferred.resolve.apply(this, arguments);
          $rootScope.$apply();
        }, function () {
          deferred.reject.apply(this, arguments);
          $rootScope.$apply();
        });

        return deferred.promise;
      },

      addTask: function (task) {

        task.userId = client.currentUser.userId;
        var deferred = $q.defer();

        client.getTable('Task').insert(task).then(function (data) {
          deferred.resolve.apply(this, arguments);
        }, function (error) {
          deferred.reject.apply(this, arguments);
        });
        return deferred.promise;
      },

      updateTask: function (task) {
        var deferred = $q.defer();
        task.userId = client.currentUser.userId;

        client.getTable('Task').update(task).then(function (data) {
          deferred.resolve.apply(this, arguments);
        }, function (error) {
          deferred.reject.apply(this, arguments);
        });
        return deferred.promise;
      }

    };
  }]);
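To show how the factory could be consumed, here’s a hedged sketch of the task controllers – the controller names and scope properties are my own assumptions rather than a verbatim copy of the app:

// consuming the azureAPI factory from the task controllers (sketch only)
angular.module('app.controllers')
  .controller('TaskListCtrl', function ($scope, azureAPI) {
    // load the logged-in user's outstanding tasks
    azureAPI.getAll().then(function (tasks) {
      $scope.tasks = tasks;
    });
  })
  .controller('AddTaskCtrl', function ($scope, $state, azureAPI) {
    // save a new task and return to the list view
    $scope.save = function (task) {
      azureAPI.addTask(task).then(function () {
        $state.go('app.list');
      });
    };
  });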

That’s it. We’ve now hooked up Google authentication and our mobile app is storing data against a Microsoft Azure Mobile Services backend.

All the code is available here:

https://github.com/aidancasey/Azure-Ionic-App

Hope this helps


Microservices – Size Doesn’t Matter!

I work as part of a team at MYOB that has been building a new platform for accountants for the past year using a microservices architecture. We’re now running approximately 20 microservices in production and we’ve learned a lot about the right size and granularity of microservices during the project.

“Microservice” is a very loaded word. It immediately evokes connotations about the size of what you are building – what is the right size and granularity for a microservice? If I go too large am I doing it all wrong? It’s gotta be small because it’s a microservice, right?

Well, when it comes to determining the right size for a microservice, unfortunately the answer is “it really depends”.

If you go too small you’ll end up with a very chatty system where a lot of processing time is spent waiting on remote calls between services. If you go too large then there is a danger that your code will become too complex and more difficult to maintain as more and more functionality gets added. Your services need to be the right size to make them composable and reusable.

So when you are determining the appropriate size for a microservice, I think it really depends on three things:

  • the maturity of your team and automation
  • the complexity of your business domain
  • how long your project has been running

Size varies depending on the maturity of your team and your automation

Firstly, you need to acknowledge the fact that you are introducing an overhead every time you create a new service. Martin Fowler describes this as a microservice premium, which I think is a pretty appropriate term. This overhead is bigger or smaller depending on your team’s maturity with microservices and how good or bad your automation is. For example, if you have invested in building a template project with the right scaffolding to make a new service immediately deployable, then the overhead to create a new service is much smaller – but you need to invest in good automation to avoid high microservice premiums. We’ve built a template project for this purpose and we are constantly looking for ways to reduce the friction involved in building and deploying our code.

Size varies depending on the complexity of your business domain

Lines of code isn’t a good way to measure the size of a microservice. Different programming languages are more or less expressive, and even in the same programming language the number of lines of code it takes different developers to solve the same problem varies dramatically. It is more useful to focus on the functionality inside the microservice. If the microservice implementation is doing little more than exposing a basic CRUD style interface, chances are you’ve gone too small and you will end up building an anaemic microservice. It is better to think in terms of business capabilities – think about the right bounded context that will give you a reusable business capability and take it from there.

Size varies depending on how long your project has been running

If you are in the early stages of a project it’s best to start with a smaller number of services. Chances are you don’t fully understand the problem domain and whatever you are building will invariably change as your understanding grows. If you are going to implement microservices you need to be prepared to refactor your services, combining them and splitting them from time to time until you get the balance just right, and you need to fully embrace an evolutionary design. Last year, one of our product teams started out with a single microservice and the guys are still happily working away on the same service. I’m pretty sure that the code will split in the future but it’s not time yet and they are wary of premature optimizations.

So there you have it – in summary, if you plan to adopt microservices remember that size isn’t something you should get hung up on. It’s better to think in terms of where your team is at and how far along you are in your project.

Hope this helps!

(This article was first published on the UnderTheHood technology blog)

Sharing a folder between Windows VMWare workstation host & Ubuntu guest

So I lost a few hours of my life today trying to set up a folder share between a Windows 7 host machine running VMware Workstation 11.0 and an Ubuntu guest machine.
I went through the following steps to link a newly created shared Windows folder.

After restarting the VM I ran the default vmware-config-tools script, accepting all the defaults:

$ sudo vmware-config-tools.pl

and ended up with the following compile-time error from the vmhgfs module:

“The filesystem driver (vmhgfs module) is used only for the shared folder
feature. The rest of the software provided by VMware Tools is designed to work
independently of this feature.

If you wish to have the shared folders feature, you can install the driver by
running vmware-config-tools.pl again after making sure that gcc, binutils, make
and the kernel sources for your running kernel are installed on your machine.
These packages are available on your distribution’s installation CD.
[ Press Enter key to continue ] “

After a lot of hunting around I finally discovered that this was a known problem with older versions of VMware’s tooling. I’m not sure how I managed to recreate it, but for what it’s worth here are the steps to fix the problem.

(1) Ensure that the latest Workstation 11 is installed.
(2) Uninstall open-vm-tools

$ sudo apt-get remove open-vm-tools

(3) Check for any missing package updates

$ sudo apt-get update

(4) Grab the following source

$ sudo git clone https://github.com/rasa/vmware-tools-patches.git

(5) Untar & Patch !

$cd vmware-tools-patches
$sudo ./download-tools.sh
$sudo ./compile.sh

Your shared folder should appear under the following directory:

/mnt/hgfs/

On choosing between Azure Database and Table Storage

How often does this happen – your teammate has just started developing a new service and they need to store some data in the cloud. “I need a database, therefore I’ll use SQL Server.” All too often relational databases are introduced into an architecture without any thought being given to the type of data that is being stored or the way it will be accessed.

When building on Windows Azure you’ve got access to a number of database technologies including SQL Database, Table Storage and DocumentDB. Changing your storage technology can be a very costly exercise, so it’s wise to spend the time upfront to consider where your data should really live.

SQL Database

SQL Database is a relational database-as-a-service offering. It’s great for storing relational data and supports rich, complex query operations. A single SQL Database can store up to 500GB of data. All of the database concepts are very familiar to anyone who has worked with SQL Server. SQL Database also offers features that enable migration, export, and ongoing synchronization of on-premises SQL Server databases with Azure SQL databases (through SQL Data Sync). If you need transactional support for batch operations then SQL Database is the way to go.

But… all of this comes at a price! When compared to Table Storage, SQL Database is far more expensive.

Table Storage

Table Storage is a fault-tolerant NoSQL key-value store. It comes in handy when you need to store very large amounts of non-relational data. Where SQL Database has an upper limit of 500GB, Table Storage maxes out at 200TB per table. The querying ability of Table Storage is much less rich. You store data by partition key and row key. It’s not possible to join tables, so you always need to know the partition key you are working with.
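To make the access pattern concrete, here’s a minimal hedged sketch using the azure-storage npm package – the account name, key, table name and key values are all placeholders:

// point lookup by partition key + row key (sketch; credentials are placeholders)
var azure = require('azure-storage');
var tableSvc = azure.createTableService('myaccount', 'my-account-key');

tableSvc.retrieveEntity('tasks', 'some-user-id', 'some-task-id', function (error, entity) {
  if (!error) {
    // retrieved entity properties come back wrapped, e.g. { _: 'value' }
    console.log(entity.text._);
  }
});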

Check out this great blog post by Troy Hunt where he explains how he works with 150 million rows of data using table storage.

Technology Selection Considerations

You should first consider using Table Storage when…

  • you need to store terabytes of data and cost is a factor
  • there are no complex relationships in the data that require server side joins and secondary indexes
  • you need to store unstructured data or the structure of every object may be different
  • you plan to scale out without sharding

You should first consider using SQL Database when…

  • you are storing highly structured, relational data
  • you need to store less than 500GB of data
  • you need to access and query the data in many different ways
  • you want to be able to run your application on-premises and in the cloud

Key Feature Comparison

Criteria | SQL Database | Table Storage
Relational data | yes | no
Transaction support | yes (full ACID transactions) | very limited
Table schema | fixed | each row can have different properties
Maximum storage | 500GB per database | 200TB per table
REST client support | yes | yes
Availability SLA | 99.9% per month | 99.9% per month

Chrome DevTools – console.table() function

It’s the simple things in life that can really make you happy… like the Chrome DevTools console.table() function. It was released in 2013 but I only learned about its awesomeness today!

Logging to the console the old way

Way, way back in the day (i.e. yesterday) I was logging objects straight to the console:

var nirvana = [
  { name: "Kurt Cobain", instrument: "guitar" },
  { name: "Krist Novoselic", instrument: "bass" },
  { name: "Dave Grohl", instrument: "drums" }
];

console.log(nirvana);

and expanding out tree views to see what was going on…


Logging to the console with console.table()

Enter console.table()

var nirvana = [
  { name: "Kurt Cobain", instrument: "guitar" },
  { name: "Krist Novoselic", instrument: "bass" },
  { name: "Dave Grohl", instrument: "drums" }
];

console.table(nirvana);

A nicely formatted table with sortable columns.
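console.table() also accepts an optional second argument listing the columns to display, which is handy when the objects are large:

// only show the 'name' column
console.table(nirvana, ["name"]);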



Thank you Chrome DevTools for caring!

Book Review – “Building Microservices” By Sam Newman

Designing Fine-Grained Systems

Microservices are currently a very hot topic in software development. By choosing to build finely grained services that are deployed independently and modeled around very specific business domains, you avoid a lot of the problems of traditional service-oriented architectures. However, there is no such thing as a free lunch when it comes to software development. If you decide to adopt a microservices architecture you need to invest heavily in automation and your DevOps capabilities.

Sam Newman does an excellent job in this book of explaining the key concepts of microservices. He tackles some really difficult topics like service versioning, how to secure service endpoints, testing approaches, how small you should build your services and how to incrementally move away from a monolithic architecture. The book does a fantastic job of highlighting common pitfalls and areas of concern for when you *are* building such a system.

I’d highly recommend this book if you are currently building out microservices or thinking about going down this path in the near future. I’ve been working on a fairly large microservices implementation for the past 6 months and I wish this book had been around when we started on our journey.

http://shop.oreilly.com/product/0636920033158.do


Enabling remote desktop on an Ubuntu VM in Windows Azure

The Ubuntu VMs available from the gallery on Windows Azure don’t come with a desktop GUI – you need to work with the server using an SSH client like PuTTY. There is no harm in setting up an alternative way to connect, just in case you have a runaway server. Below are the steps involved to set up RDP access.

Install Ubuntu Desktop

First you need to install the desktop:

sudo apt-get install ubuntu-desktop

Install xrdp package

sudo apt-get install xrdp

Enable an RDP endpoint from the azure management portal

In the management portal you need to enable an RDP endpoint (TCP port 3389).


Start your RDP session

Now fire up your remote desktop client and enter the public DNS name of the server.
