Sharing a folder between a Windows VMware Workstation host & an Ubuntu guest

So I lost a few hours of my life today trying to set up a folder share between a Windows 7 host machine running VMware Workstation 11.0 and an Ubuntu guest machine.
I went through the following steps to link a newly created shared Windows folder.

After restarting the VM I ran the default vmware-config-tools.pl script, accepting all the defaults

$ sudo vmware-config-tools.pl

and ended up with the following compile-time error from the vmhgfs module

“The filesystem driver (vmhgfs module) is used only for the shared folder
feature. The rest of the software provided by VMware Tools is designed to work
independently of this feature.

If you wish to have the shared folders feature, you can install the driver by
running vmware-config-tools.pl again after making sure that gcc, binutils, make
and the kernel sources for your running kernel are installed on your machine.
These packages are available on your distribution’s installation CD.
[ Press Enter key to continue ] “

After a lot of hunting around I finally discovered that this was a known problem with older versions of VMware's tooling. I'm not sure how I managed to recreate it, but for what it's worth here are the steps to fix the problem.

(1) Ensure that the latest Workstation 11 is installed.
(2) Uninstall open-vm-tools

$ sudo apt-get remove open-vm-tools

(3) Check for any missing package updates

$ sudo apt-get update

(4) Grab the following source

$ sudo git clone https://github.com/rasa/vmware-tools-patches.git

(5) Untar & Patch!

$ cd vmware-tools-patches
$ sudo ./download-tools.sh
$ sudo ./compile.sh

Your shared folder should appear under the following directory

/mnt/hgfs/

Chrome DevTools – console.table() function

It's the simple things in life that can really make you happy… like the Chrome Developer Tools console.table() function. It was released in 2013 but I only learned about its awesomeness today!

Logging to the console the old way

Way, way back in the day (i.e. yesterday) I was logging objects straight to the console

var nirvana = [
  { name: "Kurt Cobain", instrument: "guitar" },
  { name: "Krist Novoselic", instrument: "bass" },
  { name: "Dave Grohl", instrument: "drums" }
];

console.log(nirvana);

and expanding out tree views to see what was going on…

old skool

Logging to the console with console.table()

Enter console.table()

var nirvana = [
  { name: "Kurt Cobain", instrument: "guitar" },
  { name: "Krist Novoselic", instrument: "bass" },
  { name: "Dave Grohl", instrument: "drums" }
];

console.table(nirvana);

A nicely formatted table with sortable columns.

new skool
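As a bonus, console.table() also accepts an optional second argument listing which columns to display — handy when your objects carry more fields than you care about:

```javascript
// console.table() with a column filter: pass an array of property
// names as the second argument to display only those columns.
var nirvana = [
  { name: "Kurt Cobain", instrument: "guitar" },
  { name: "Krist Novoselic", instrument: "bass" },
  { name: "Dave Grohl", instrument: "drums" }
];

// Render the table with only the "name" column visible
console.table(nirvana, ["name"]);
```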


Thank you, Chrome DevTools, for caring!

Book Review – “Building Microservices” By Sam Newman

Designing Fine-Grained Systems

Microservices are currently a very hot topic in software development. By choosing to build fine-grained services that are deployed independently and modeled around very specific business domains, you avoid a lot of the problems of traditional service-oriented architectures. However, there is no such thing as a free lunch when it comes to software development. If you decide to adopt a microservices architecture you need to invest heavily in automation and your DevOps capabilities.

Sam Newman does an excellent job in this book of explaining the key concepts of microservices. He tackles some really difficult topics: service versioning, how to secure service endpoints, testing approaches, how small you should build your services, and how to incrementally move away from a monolithic architecture. The book does a fantastic job of highlighting common pitfalls and areas of concern for when you *are* building such a system.

I’d highly recommend this book if you are currently building out microservices or thinking about going down this path in the near future. I’ve been working on a fairly large microservices implementation for the past 6 months and I wish this book had been around when we started on our journey.

http://shop.oreilly.com/product/0636920033158.do


Tasting some AWS Lambda Curry!

Lambda Curry!
Serve up your own event-driven recipes from Amazon's Cloud Cooker!

In this post I'll share my thoughts on AWS Lambda – a new offering from Amazon Web Services that is currently in technical preview. I'll take you through the good and bad parts and explain why there is so much excitement and hype in the community about this new offering.

I've created a small open source project on github to demonstrate how you can use AWS Lambda to synchronize files between an S3 bucket and a personal Dropbox account. I'm sure there will be a whole cookbook of recipes appearing in the community once the service goes to general release.

Lambda is currently in a closed technical preview. I've been fortunate enough to get early access, and I've spent the past few days writing and deploying some Lambda functions to the EU (Ireland) region.


(above) AWS CTO Werner Vogels announces AWS Lambda @ AWS re:Invent

So what is AWS Lambda?

AWS Lambda is a new service that runs your code (node.js functions) in response to events that happen in the AWS cloud. You only pay for the compute time required to run your code, and billing is metered in increments of 100 milliseconds, making it a very economical offering. This is a pure platform-as-a-service play from Amazon: you don't need to spin up and manage any infrastructure on AWS to run your lambda functions. To paraphrase the great Nigel Tufnel from Spinal Tap, if platform as a service is a volume dial from one to ten then AWS Lambda goes to eleven!


(above) The volume knobs of Tufnel’s Marshall amplifier went up to eleven 

Event-driven programming is a paradigm which has been around for many years now, in which the flow of the program is determined by external events. Most desktop applications are event-driven – a button click or key press event triggers a specific behavior in the program. Database triggers are another great example of event-driven programming: the event of a row being updated in a database table can trigger the execution of custom code. AWS Lambda exposes an event-driven programming model to the AWS cloud. For now the number of events on offer is quite small, but this will grow over time. Currently you can tap into events such as:

  • A new object was created in an S3 bucket (PUT/POST/COPY operations)
  • DynamoDB table change tracking – any changes made to the table are tracked on a rolling 24-hour basis. You can retrieve this stream of update records with a single API call.
  • New messages arriving in an AWS Kinesis stream

You can also expose your own custom events and call these as you like from your own application code.

Typically lambda functions will respond to a trigger within milliseconds of the event occurring, and from my experience to date the response times have been in the teens of milliseconds.

So what's the big deal with Lambda?

Well, prior to this service offering, if you wanted to respond to an event like synchronising an S3 object with a Dropbox folder you'd need to build something like this:

  • When you add a new file to an S3 bucket you’d then place a message on a dedicated SQS message queue.
  • You’d deploy a fleet of EC2 workers to poll the queue and process the messages
  • Finally you’d tweak the auto scale settings to ensure the workers can scale accordingly

With Lambda all this goes out the window: your lambda functions execute in a fully managed, shared compute environment.

How much does it cost?

Amazon has priced the service based on the number of Lambda function requests served and the compute time needed to run your code. The basic formula is that you pay $0.20 per 1 million requests, plus $0.00001667 per GB-second of compute time used. There's also a free tier that includes 1 million requests and 400,000 GB-seconds of compute per month. Simple, eh? Well, just in case you are confused, Amazon has provided a few use cases here to give you a feel for the prices.
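To make that formula concrete, here's a back-of-the-envelope cost calculator using the published numbers. The example workload (3 million one-second invocations at 512MB) is my own invention, and note that real billing rounds each invocation up to the nearest 100 milliseconds, which this sketch ignores.

```javascript
// Rough monthly cost estimate for AWS Lambda using the preview-era
// pricing quoted above: $0.20 per 1M requests, $0.00001667 per
// GB-second, with 1M requests and 400,000 GB-seconds free per month.
function lambdaMonthlyCost(requests, avgDurationSeconds, memoryGb) {
  var REQUEST_PRICE = 0.20 / 1e6;   // per request
  var COMPUTE_PRICE = 0.00001667;   // per GB-second
  var FREE_REQUESTS = 1e6;
  var FREE_GB_SECONDS = 400000;

  var billableRequests = Math.max(0, requests - FREE_REQUESTS);
  var gbSeconds = requests * avgDurationSeconds * memoryGb;
  var billableGbSeconds = Math.max(0, gbSeconds - FREE_GB_SECONDS);

  return billableRequests * REQUEST_PRICE +
         billableGbSeconds * COMPUTE_PRICE;
}

// e.g. 3M requests/month, 1 second each, 512MB allocated:
console.log(lambdaMonthlyCost(3e6, 1, 0.5).toFixed(2)); // "18.74"
```

A small workload fits entirely inside the free tier and costs nothing, which is why the offering is so attractive for event-glue code like the S3-to-Dropbox sync.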


Programming Model

The entry point for your lambda function expects two parameters:

  • The event parameter contains all the data relating to the event; in this case it's an object being created in an S3 bucket.
  • The context parameter is used to notify Lambda that your function has completed successfully.

exports.handler = function (event, context) {

    var srcBucket = event.Records[0].s3.bucket.name;
    var srcKey    = event.Records[0].s3.object.key;

    getToken().then(function (token) {
        // return the inner promise so the outer chain waits for it
        return getFile(srcBucket, srcKey)
            .then(function (fileStream) {
                // sync file
                context.done(null, "file sync complete");
            })
            .catch(function (error) {
                console.log(error);
                // signal failure back to Lambda
                context.done(error);
            });
    }).done();
};
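For reference, here's a trimmed-down sketch of the S3 event the handler receives, showing only the fields the handler actually reads (bucket name and object key). The real payload carries additional metadata such as the region, event time and request IDs.

```javascript
// A minimal sketch of the S3 event payload shape read by the handler:
// an array of Records, each carrying the bucket and object involved.
var sampleEvent = {
  Records: [{
    s3: {
      bucket: { name: 'my-bucket' },
      object: { key: 'photos/cover.jpg' }
    }
  }]
};

// The same two lookups the handler performs:
var srcBucket = sampleEvent.Records[0].s3.bucket.name;
var srcKey    = sampleEvent.Records[0].s3.object.key;
console.log(srcBucket + '/' + srcKey); // my-bucket/photos/cover.jpg
```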

Anatomy of a Deployment Package

A deployment package consists of a zip file containing your node code and any node module dependencies. As with any AWS resource, you have the usual fine-grained access controls via IAM roles and permissions.

Deploying your Code

You can upload your deployment package straight from the AWS console or use the AWS CLI. If you are doing any serious development work, check out this grunt plug-in that lets you build and deploy your packages through the grunt task runner.
Author disclosure: I am a contributor to the grunt-aws-lambda project on github.


Monitoring and Health checks

Out of the box, Lambda comes with CloudWatch metrics including request duration, request count and execution error counts. It would be great to see the error handling functionality grow to include things like retry logic and different error notification options.

What I would like to see

  • Many more events exposed – this will surely come with time
  • Time-based triggering of custom events
  • Reserved instances for executing your lambda functions. This would satisfy customers that are worried about their code executing in a multi-tenanted environment and guarantee consistent performance and response times.
  • Support for more languages. Don't get me wrong, I'm a big fan of node.js, but AWS Lambda should not be limited to JavaScript only.
  • A great eco-system to share AWS recipes with other AWS customers.
  • Support for NPM at run time – currently you need to zip up and deploy all the node modules that your lambda function relies on
  • Pre-canned retry and error handling patterns to choose from
  • Lambda is screaming for some great IFTTT integrations!
  • Some nice DevOps integration to enable AWS infrastructure deployments from a vanilla S3 bucket upload

Lambda is still in technical preview. If you'd like to get your hands dirty and have a play, head on over to the AWS Lambda Preview Site and register your details…

http://aws.amazon.com/lambda/preview/


Useful resources

Video – AWS Lambda Announcement at Reinvent

Github S3 to Dropbox file synchronization

AWS Official Documentation

AWS Lambda Walkthrough Command Line Companion

Lambdash – AWS Lambda Shell

Grunt AWS Lambda task runner


I’m the first Irishman to contribute to ASP.NET !

So I'm pretty confident that I'm the first Irishman to contribute to the next version of Microsoft's ASP.NET.

It's a pretty funny story. I attended NDC London last month and David Fowler was on stage demonstrating some of the cool features of ASP.NET vNext when he noticed a small bug. I fixed it that night, submitted a pull request, and it's been accepted into the main build.

It’s really great to see Microsoft embracing open source and the community.

ASP.NET vNext is a complete rewrite from the ground up and will run on both Linux & Windows. It's a big move in the right direction from Microsoft. Gone are the days of vendor lock-in: you no longer need Windows to run ASP.NET, you no longer need a Microsoft IDE to write your code, and there are already a bunch of open source plug-ins coming for Sublime and your favorite text editors. It's heavily inspired by frameworks like Sinatra / Express – out of the box it's a minimalistic web hosting framework, and if you want more bells and whistles you need to pull them in as extra packages. It's still a work in progress, but hopefully we'll see ASP.NET vNext in the market in 2015.

Anyway, here's my pull request in all its glory. Open source at Microsoft is definitely open for business!

https://github.com/aspnet/Hosting/pull/125

NDC London 2014 Highlights

Last week I traveled to London to attend the NDC 2014 developers conference. It was an excellent conference – great speakers, a really friendly crowd, well organised and run. I'd highly recommend it to any software developer. All the sessions were recorded and I'd expect them to appear on Vimeo shortly; I already have access to the recordings by registering as a conference delegate.

Here's a rundown of my favorite talks & highlights from the conference, in no particular order.

“Reactive Game Development For The Discerning Hipster” – Bodil Stokke

This was a real breath of fresh air! Bodil took to her keyboard and built out a working game from scratch using the JavaScript RxJS reactive extensions library. There were ponies jumping around on the screen, avoiding obstacles and catching magic coins. Well done – it was really brave to get up there and code live on stage. We need more live coding at tech conferences. She showed how easy it is to compose an application using the asynchronous reactive library without a single callback in sight. Flying ponies, live coding & reactive extensions – woot!!


“ASM.js, SIMD, and JS as a compiled-language virtual machine” – Brendan Eich.

Firstly, wow – I can't believe I came face to face with the creator of JavaScript! Brendan took us through a brief history of the language, from the early days back at Netscape all the way through to the present day with the ECMAScript 6 language enhancements, ahead-of-time compilation engines and then beyond to ASM.js – a subset of JavaScript which provides a model closer to C/C++ by eliminating dynamic type guards, boxed values, and garbage collection. The code can be compiled ahead of time and stored in offline storage, giving you fast start-up times with very good performance characteristics. Brendan shared preliminary benchmarks of C programs compiled to ASM.js that are within a factor of 2 slowdown over native compilation with Clang. Game developers like Unity are working with ASM.js as a way to get their games running on the Web without plug-ins. This also opens the door to many new types of games and mash-ups with everything running in the browser.

Check out Brendan Eich playing a rewrite of Doom with a mash-up inside it where another port of Doom is running in an iframe. A game within a game – confused? I know I was!

“Practical Considerations for Microservices” – Sam Newman

This talk was very close to my heart – I've spent the last 12 months working with a large team on a successful project built from the ground up using microservices. Sam did a great job explaining what microservices are and the pitfalls to avoid when you adopt this style of architecture. Great common sense stuff here; it all resonated with me and it's great to see microservices becoming more mainstream. Sam talked about what you should standardise across a project – make sure you use consistent message exchange patterns, monitoring and deployment approaches, but don't get hung up on how the microservices are built internally. I really enjoyed this session.

“Five (or so) Essential Things to Know About ASP.NET vNext” – David Fowler and Damian Edwards

Damian Edwards and David Fowler demonstrated ASP.NET vNext with some fun code samples and slides that explained the brand new stack. The key word here is “new”. Shiny, shiny new! It looks like a rewrite from the ground up to support Windows and Linux for the first time. The photo below shows the Windows components in blue and the new Linux stack in orange.

2014-12-04 16.27.25

The guys shared a lot of information in 60 minutes. Web.config files are now gone! In future you'll be working with project.json files to store all your project dependencies. This is going to make it much easier to author .NET apps outside of Visual Studio. The guys demonstrated the cross-platform support by writing the code once and running it on a Windows and a Linux VM – this got a great round of applause from the crowd!

Next it was on to dynamic recompilation – any changes to a dynamically compiled file will automatically invalidate the file's cached compiled assembly and trigger a recompilation. The guys changed some C# controller code and the changes appeared with a browser refresh. This is a great step forward; in theory you can now eliminate the whole compile step from your deployment process. This is all possible because ASP.NET is now leveraging the Roslyn compiler-as-a-service.

It looks like there will be an awful lot of breaking changes with the new version. All the web server middleware has been cleanly separated out into separate NuGet packages. I'm guessing the middleware interfaces are very close to the OWIN interfaces. When you create a new ASP.NET project everything is turned off by default; you need to enable and pull down the middleware packages you need for your app. This is a really good thing: your web application will be much more lightweight without any additional bloat. The demos reminded me a lot of the minimalist node.js Express Framework. Oh yeah, and WebAPI is now part of the same codebase!

“Lessons From Large AngularJS Projects” – Scott Allen

Scott Allen delivered an excellent talk on patterns and approaches to consider on your next AngularJS project. Things like error handling – how to set up an error handling service to handle all unhandled errors rather than relying on $scope.$emit and the evil $rootScope variable. He demonstrated some clean code to manage security tokens using http interceptors and decorators; I'll definitely be digging into this one further. Finally he covered the $httpBackend mocking service that lets you program expectations for external http calls without having to go over the wire in an automated test.

“Kicking the complexity habit” – Dan North

Dan North gave a really funny talk on how we should all be avoiding unnecessary complexity at every point in the SDLC. Without rigorous care and attention software quickly becomes messy and unmanageable. Even with the best intentions entropy and complexity are a fact of life in growing applications.  From your IDE to your automated build, from DDD’s ACLs to TDD and other TLAs, from backlogs to burn-ups, we are surrounded by props for coping with complexity. As appealing as these are, they also make us less likely to address the underlying problem of complexity itself. Dan believes you can learn to recognise these coping mechanisms for what they are, and intends to set you on the path to simplicating your programming life. Great talk and very thought provoking.

Fixing the dastardly NuGet Error : “Unable to connect to remote server”

Package managers are awesome… when they work! Lately I've been having a lot of trouble running NuGet package restore commands on my dev machine. It's all my own doing, but I thought I'd share the pain points with you, along with 5 simple steps to follow whenever you encounter the error that will get you back up and running…

Step 1 – (seriously) close Visual Studio, reboot your machine and try again

Step 2 – disable any HTTP debugging proxies on your machine

If you are running a local proxy debugging tool such as Fiddler or OWASP Zed Attack Proxy, you've broken NuGet! You'll need to turn them off (open Fiddler => Fiddler Options => Connections => uncheck “act as system proxy on startup”) and ensure that the HTTP_PROXY environment variable is removed from your system settings.

Step 3 – (seriously) close Visual Studio again, reboot your machine and try again

Step 4 – clear the package cache(s) on your machine

NuGet has deep pockets, and lots of them!

  • Delete the contents of the following folder

C:\Users\{username}\AppData\Local\NuGet\Cache

  • Delete the contents of the “packages” folder that sits alongside your Visual Studio .sln file

Step 5 – if you are using a custom package source, delete and recreate it

We run an internal NuGet server in the office. The trouble is, I work remotely from the other side of the world, so I need to connect over a VPN to access the server. This is way too slow for me, so I sync up a local folder on my PC with the NuGet packages from the office and point NuGet at that folder using a custom package source.


Occasionally NuGet has trouble dealing with this and I need to delete the package source and recreate it…

Voilà – you are hopefully back in business!