Fixing the dastardly NuGet error: “Unable to connect to remote server”

Package managers are awesome… when they work! Lately I’ve been having a lot of trouble running NuGet package restore commands on my dev machine. It’s all my own doing, but I thought I’d share the pain points with you, along with five simple steps to follow whenever you encounter the error that will get you back up and running…

Step 1 – (seriously) close Visual Studio, reboot your machine and try again

Step 2 – disable any HTTP debugging proxies on your machine

If you are running a local proxy debugging tool such as Fiddler or OWASP Zed Attack Proxy, you’ve broken NuGet! You’ll need to turn it off (in Fiddler: Tools => Fiddler Options => Connections => uncheck “Act as system proxy on startup”) and ensure that the HTTP_PROXY environment variable is removed from your system settings.
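
You can remove the variable through the System Properties dialog, or from a command prompt – a quick sketch (this removes the per-user variable; if it was set machine-wide you’ll need the System Properties route, and either way open a fresh command prompt afterwards for the change to take effect):

echo %HTTP_PROXY%
reg delete "HKCU\Environment" /v HTTP_PROXY /f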

Step 3 – (seriously) close Visual Studio again, reboot your machine and try again

Step 4 – clear the package cache(s) on your machine

NuGet has deep pockets, and lots of them!

  • Delete the contents of the cache folder:

C:\Users\{username}\AppData\Local\NuGet\Cache

  • Delete the contents of the “packages” folder that sits alongside your Visual Studio .sln file (or script both deletions – see the commands below)
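
If you’d rather do it from a command prompt, something along these lines should work – the first path is the default cache location, the second is a hypothetical example, so substitute your own solution folder:

rd /s /q "%LocalAppData%\NuGet\Cache"
rd /s /q "C:\Projects\MySolution\packages"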

Step 5 – if you are using a custom package source, delete and recreate it

We run an internal NuGet server in the office. The trouble is I work remotely from the other side of the world, so I have to connect over a VPN to reach the server. That’s way too slow for me, so instead I sync the contents of a local folder on my PC with the NuGet packages from the office and point to that folder using a custom package source.


Occasionally NuGet has trouble dealing with this and I need to delete the package source and recreate it…
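
You can do this from the Package Manager settings in Visual Studio, or with nuget.exe – a rough sketch, where the source name and folder path are made-up examples:

nuget sources Remove -Name "OfficePackages"
nuget sources Add -Name "OfficePackages" -Source "C:\LocalNuGet\OfficePackages"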

Voilà – you are hopefully back in business!

Don’t get hung up on code coverage statistics

Automated unit tests are a critical asset on every software project. They give you the confidence to constantly refactor and evolve your design as your code morphs. Exception handling and boundary condition checks are often forgotten unless you take the time to think them through, and writing unit tests gives you the head space to do just that. Then there’s the poor sod who has to maintain your code in the future. The first place I go to understand how some code works is the unit tests.

Code coverage reports and statistics are fraught with danger. Coverage reports should not be used as a management tool to judge the overall quality of a solution. All they really tell you with certainty is how much code hasn’t been tested at all. Getting hung up on the code coverage percentage is self-defeating. If teams feel pressured into achieving a certain magic number there is a real danger that quantity becomes more important than the quality of the tests, and you’ve missed the whole point. The focus should be on writing valuable unit tests that improve the quality and resilience of the overall solution; the code coverage metric is simply a side effect of this process.

A better way to use a code coverage report is as a conversation starter with your team. If one area of the code has low coverage, find out why: make sure you understand the functionality that lives there – maybe the code is trivial, or it’s better exercised by an integration test. As the project iterations unfold, expect the total number of unit tests to climb steadily, but don’t get hung up on the numbers.

Microsoft open sources the .NET compiler platform “Roslyn”

One of the big announcements to come out of this week’s //BUILD conference was that Microsoft have now open sourced a preview of Roslyn – their next generation compiler for C# and VB.NET.

Roslyn exposes the entire compiler pipeline – the source code parser, semantic analysis and the syntax tree – in high fidelity as an API. This opens the door for language geeks to roll their own code inspection and refactoring tools (like ReSharper). You can also leverage Roslyn to create brand new programming languages.

The original C# compiler must have been getting pretty unwieldy given its roots stretch back to the late 1990s. I know the code I wrote back then is beyond comprehension! I’m guessing that the primary motivation for rewriting the compiler was an internal one for Microsoft – a clean new compiler will enable Microsoft to innovate on new language features more quickly, and makes it easier to create new languages. With this in mind, the move to open source Roslyn is hugely significant. From now on the entire community (and not just Microsoft) will reap the benefits of the compiler as a service.

Roslyn has been a long time in the making – the last CTP was in 2012, and late last year it was announced that the daily builds of Visual Studio were being compiled using Roslyn. Here’s hoping that the final version makes it to the big time soon.

The .NET Compiler Platform can be found here.

What to expect in HTTP/2.0

It’s been almost 15 years since a new version of HTTP was released, but the wait is almost over. According to this Wikipedia link, HTTP/2.0 may be ready by the end of 2014. The good news is that all the semantics remain the same: there are no planned changes to HTTP methods and status codes, so we are all 200-OK with this (pardon the pun).

The most significant change is at the transport layer, with the introduction of HTTP multiplexing. A new binary framing layer will allow the delivery of multiple requests and responses over a single shared connection without them blocking each other. Currently you need to open multiple connections to a server to get parallelism, or resort to something like WebSockets. With multiplexing, all requests for resources will execute in parallel by default.

Multiplexing has the potential to make many of today’s performance tweaks obsolete. Will you still need to consider image spriting when multiple images can be requested in parallel at load time? Will we still need to concatenate CSS and JavaScript into a single file to reduce the number of page requests? Maybe the real way to speed up your website will be to tweak the priority assigned to each requested resource rather than worrying about requesting too many resources.
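
If you want to poke at the binary framing today, the nghttp2 project ships a command line client that speaks the draft protocol – a quick sketch, assuming you have nghttp installed and point it at a server that already talks the draft (the nghttp2 project runs one); the -v flag prints each frame as it arrives:

nghttp -nv https://nghttp2.org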

Part of me will definitely be sad to see the end of the text-based protocol. I love the idea that, under the covers, something as powerful as a browser and a web server can figure everything out through a simple text-based exchange.

Other things coming in HTTP/2.0 include:

  • Cache Pushing – proactively pushing content to the browser ahead of time to speed things up
  • HTTP Header Compression

You can check out a draft of the specification here (warning: strong coffee required!)

http://tools.ietf.org/html/draft-ietf-httpbis-http2-10

On being a technical book reviewer

Recently I worked as a technical reviewer for a new book on Windows Azure Mobile Services – “Learning Windows Azure Mobile Services for Windows 8 and Windows Phone 8” by Geoff Webber-Cross.

Azure Mobile Services is a technology that I’m pretty familiar with – I delivered a number of talks and demos on it to user groups in Australia and Ireland last year. I’d never contributed to a book before, so when Packt Publishing approached me I decided to give it a try. Overall it was a really enjoyable experience, and it’s always nice to give something back to the community.

Here’s the lowdown on what is expected of you as a tech reviewer.

The publishing deadlines were tight – I had to commit to reviewing a chapter every 2-3 days. Each chapter was then emailed to me one at a time, and I was asked to read through it and comment on the following areas:

  • Has the author left out any important topics?
  • Is the order of the content logical?
  • Are the code examples correct?
  • What could the author do to make the book more interesting?
  • How would you rate the chapter out of 10?
  • What would you change in order to make it an 8/10 or higher?
  • Has the author explained the concepts clearly enough?

The process gave me a real appreciation for the amount of time and effort involved in creating a book. It’s a massive commitment – I take my hat off to everyone who has written one.

For those of you interested in learning more about Mobile Services, the book is now available for purchase online. If you do read it, I’d love to hear what you think!

Neo4j GraphGist Competition – AWS Global Infrastructure graph

Looking for a fun side project for the weekend? Well, it’s not too late to enter the Neo4j GraphGist Winter Challenge! You have until the end of the month to submit an entry.

I spent a few hours this weekend modelling the AWS global infrastructure graph – you can check out my GraphGist here.

I haven’t worked with Neo4j in a long time and it surprised me how quickly it all came back – Neo4j is whiteboard-friendly, so grab yourself a marker this weekend and immerse yourself in some graph database concepts!

To submit an entry you’ll need to design your graph, come up with the Cypher statements to build and query it, and compose it all nicely in an AsciiDoc GraphGist file.

Have fun!


http://www.neo4j.org/learn/graphgist_challenge

Enabling remote desktop on an Ubuntu VM in Windows Azure

The Ubuntu VMs available from the gallery on Windows Azure don’t come with a desktop GUI – you need to work with the server using an SSH client like PuTTY. Still, there’s no harm in setting up an alternative way to connect, just in case you have a runaway server. Below are the steps involved in setting up RDP access.

Install Ubuntu Desktop

First you need to update your package lists and install the desktop:

sudo apt-get update
sudo apt-get install ubuntu-desktop

Install the xrdp package

sudo apt-get install xrdp
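
The xrdp service should start automatically once the package is installed – if you ever need to restart it yourself, this should do it:

sudo service xrdp restart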

Enable an RDP endpoint from the Azure management portal

Over in the management portal you need to add an endpoint for the RDP port (TCP 3389) on the VM.
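
If you have the cross-platform Azure command line tools installed you can add the endpoint from there instead – a sketch, where myubuntuvm is a made-up VM name (the two numbers are the public and private ports):

azure vm endpoint create myubuntuvm 3389 3389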


Start your RDP session

Now fire up your remote desktop client and enter the public DNS name of the server.
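
From a Windows machine that’s a one-liner – again assuming a made-up cloud service name of myubuntuvm:

mstsc /v:myubuntuvm.cloudapp.net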
