One of the big announcements to come out of this week's //BUILD conference was that Microsoft has open-sourced a preview of Roslyn, its next-generation compiler for C# and VB.NET.
Roslyn exposes the entire source code parser, semantic analysis and syntax tree in high fidelity as an API. This opens the door for language geeks to roll their own code inspection and refactoring tools (like ReSharper). You can also leverage Roslyn to create brand new programming languages.
The original C# compiler must have been getting pretty unwieldy given it's been around since 1998. I know the code I wrote in 1998 is beyond comprehension! I'm guessing the primary motivation for rewriting the compiler was an internal one for Microsoft: a new, clean compiler lets them innovate on language features and create new languages more quickly. With this in mind, the move to open source Roslyn is hugely significant. From now on the entire community (and not just Microsoft) will reap the benefits of the compiler as a service.
Roslyn has been a long time in the making: the last CTP was in 2012, and late last year it was announced that the daily builds of Visual Studio were being compiled using Roslyn. Here's hoping that the final version makes it to the big time soon.
The .NET Compiler Platform can be found here.
It's been almost 15 years since a new version of HTTP was released, but the wait is almost over. According to this Wikipedia article, HTTP/2.0 may be ready by the end of 2014. The good news is that all the semantics remain the same; there are no planned changes to HTTP methods and status codes, so we are all 200-OK with this (pardon the pun).
Part of me will definitely be sad to see the end of the text-based protocol. I love the idea that under the covers something as powerful as a browser and a web server can figure everything out through a simple text-based protocol.
Other things coming in HTTP/2.0 include:
- Cache Pushing – proactively pushing content to the browser ahead of time to speed things up
- HTTP Header Compression
You can check out a draft of the specification here (warning: strong coffee required!).
Recently I worked as a technical reviewer for a new book on Windows Azure Mobile Services, "Learning Windows Azure Mobile Services for Windows 8 and Windows Phone 8" by Geoff Webber.
Azure Mobile Services is technology that I'm pretty familiar with: I delivered a number of talks and demos on Mobile Services to user groups in Australia and Ireland last year. I'd never contributed to a book before, so when Packt Publishing approached me I decided to give it a try. Overall it was a really enjoyable experience, and it's always nice to give something back to the community.
Here’s the low down on what is expected of you as a tech reviewer.
The publishing deadlines were tight – I had to commit to reviewing a chapter every 2-3 days. I was emailed each chapter one at a time and asked to read through it and comment on the following areas:
- Has the author left out any important topics?
- Is the order of the content logical?
- Are the code examples correct?
- What could the author do to make the book more interesting?
- How would you rate the chapter out of 10?
- What would you change in order to make it an 8/10 or higher?
- Has the author explained the concepts clearly enough?
The process gave me a real appreciation for the amount of time and effort involved in creating a book. It's a massive commitment – I take my hat off to everyone who has written one.
For those of you interested in learning more about Mobile Services, the book is now available for purchase online. If you do read the book I'd love to hear what you think of it!
Looking for a fun side project for the weekend? Well, it's not too late to enter the Neo4j GraphGist Winter Challenge! You have until the end of the month to submit an entry.
I spent a few hours this weekend modelling the AWS global infrastructure as a graph – you can check out my graphgist here.
I haven't worked with Neo4j in a long time, and it surprised me how quickly it all came back – Neo4j is whiteboard friendly, so grab yourself a marker this weekend and immerse yourself in some graph database concepts!
To submit an entry you'll need to design your graph, come up with the Cypher statements to build and query it, and compose it all nicely in an AsciiDoc GraphGist file.
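If you haven't written Cypher before, here's a rough sketch of the kind of statements involved, borrowing the AWS infrastructure theme from my graphgist (the labels and properties below are made up for illustration):

```cypher
// Create a region and an availability zone, then link them
CREATE (r:Region {name: 'eu-west-1'})
CREATE (z:Zone {name: 'eu-west-1a'})
CREATE (r)-[:HAS_ZONE]->(z);

// Query: list the zones in each region
MATCH (r:Region)-[:HAS_ZONE]->(z:Zone)
RETURN r.name, z.name;
```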
Have fun!
The Ubuntu VMs available from the gallery on Windows Azure don't come with a desktop GUI – you need to work with the server using an SSH client like PuTTY. Still, there's no harm in setting up an alternative way to connect just in case you have a runaway server. Below are the steps involved in setting up RDP access.
Install Ubuntu Desktop
First you need to install the desktop environment:
sudo apt-get install ubuntu-desktop
Install the xrdp package
sudo apt-get install xrdp
Enable an RDP endpoint from the Azure management portal
In the management portal you need to add a TCP endpoint for port 3389, the default RDP port.
Start your RDP session
Now fire up your Remote Desktop client and enter the public DNS name of the server.
I've created this interactive visualization of New Year's Eve 2014 as it unfolded on Twitter, using a bunch of open source tools running on Windows Azure.
Recently I've been spending some time working with Big Data and Hadoop distributions, and I was trying to come up with a "useful" side project to play around with the technology. What bigger event is there on Twitter than the annual #happynewyear tweets as they fly around the world at the dawn of 2014?
I connected to Twitter's streaming API using a simple node.js client. The open source node package, appropriately named Twit, by Tolga Tezel does all the heavy lifting for me in a few lines of code. I aggregated over 6 million tweets in 24 hours – averaging 60 tweets per second. According to Twitter's documentation, the streaming API gives you access to 1% of the Twitter firehose at any one time, and judging by the geographic spread of the tweets I suspect it is sympathetic to where in the world you connect from – I was running out of the Windows Azure data center in Dublin.
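If memory serves, Twit wires the stream up in a handful of lines – something like `T.stream('statuses/filter', { track: '#happynewyear' }).on('tweet', handler)` (check the Twit README for the exact calls). The handler itself is plain JavaScript; a hypothetical collector that batches tweets and tracks the arrival rate might look like this:

```javascript
// Hypothetical collector: batches incoming tweets and tracks the arrival rate.
// In the real app this would be registered as Twit's 'tweet' event handler.
function makeCollector() {
  var tweets = [];
  return {
    // called once per incoming tweet
    handleTweet: function (tweet) {
      tweets.push(tweet);
    },
    count: function () {
      return tweets.length;
    },
    // average tweets per second over an elapsed window (in seconds)
    ratePerSecond: function (elapsedSeconds) {
      return tweets.length / elapsedSeconds;
    }
  };
}
```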
Processing the data
Now on to the data crunching. I uploaded all the tweets in multiple 20MB text files to Windows Azure Blob Storage and spun up an 8-node HDInsight Hadoop cluster to process the data. Storing the tweets in blob storage, rather than on the cluster itself, gave me the flexibility to only spin up the cluster for a couple of minutes. I aggregated all the tweets that had a place associated with them and extracted the latitude and longitude coordinates.
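Tweets that carry a place field include a GeoJSON-style bounding box whose corners are [longitude, latitude] pairs; averaging the corners gives a single usable point. The actual HDInsight job isn't shown here, but the extraction logic boils down to something like this sketch in plain JavaScript:

```javascript
// Extract an approximate lat/lng point from a tweet's place bounding box.
// Bounding box corners come as [longitude, latitude] pairs (GeoJSON order).
function placeToLatLng(tweet) {
  if (!tweet.place || !tweet.place.bounding_box) {
    return null; // no place attached to this tweet
  }
  var corners = tweet.place.bounding_box.coordinates[0];
  var lng = 0, lat = 0;
  for (var i = 0; i < corners.length; i++) {
    lng += corners[i][0];
    lat += corners[i][1];
  }
  // centroid of the bounding box corners
  return { lat: lat / corners.length, lng: lng / corners.length };
}
```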
Visualizing the results
I used Chrome's open source WebGL Globe platform to showcase the results in an interactive 360-degree visualization of the data. You'll need to be running a WebGL-enabled browser when you connect to the website.
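If I remember the Globe's input format correctly, it consumes a flat array of [latitude, longitude, magnitude] triples with magnitudes normalized to the 0–1 range (worth double-checking against the WebGL Globe README). A hypothetical converter from aggregated per-location tweet counts:

```javascript
// Convert aggregated buckets ({lat, lng, count}) into the flat
// [lat, lng, magnitude, ...] array shape, normalizing counts to 0..1.
function toGlobeSeries(buckets) {
  var max = 0;
  buckets.forEach(function (b) {
    if (b.count > max) { max = b.count; }
  });
  var flat = [];
  buckets.forEach(function (b) {
    flat.push(b.lat, b.lng, max ? b.count / max : 0);
  });
  return flat;
}
```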
Open Source tools – power to the people
This experiment cost me absolutely zip to conduct! All the code and technologies I used are open source – node.js, Hadoop and WebGL Globe. The cloud compute time also came free of charge thanks to my MSDN subscription.
The source code is available here, may the source be with you and #happynewyear
So it turns out that Windows Azure Websites won't serve up static .json files by default, no matter how many times you hit F5!
If you try to access such a file you'll see a 404 File Not Found error, because the MIME type for .json is not set by default. The same applies to other "non-standard" file types that need a specific MIME type, such as .mp3, .csv, etc.
To fix this issue you need to set the correct MIME types in your web.config file and redeploy – easy when you know how!
<mimeMap fileExtension=".json" mimeType="application/json" />
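For context, the mimeMap element lives under system.webServer/staticContent, so a minimal web.config looks something like this (assuming you have no other static content settings to merge it with):

```xml
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".json" mimeType="application/json" />
    </staticContent>
  </system.webServer>
</configuration>
```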
More information on configuring MIME types in IIS here.