
Posts Tagged ‘clr’

.NET Tools: NDepend static analysis tool, leave T-Pain behind.

June 1st, 2009

The release of Visual Studio 2008 brought Code Metrics to the IDE’s ‘out-of-the-box’ functionality (I’ve been overusing that phrase thanks to our resident CRM Consultant at work!). This was a major boon for .NET developers wanting a clear idea of the health of what they write; Visual Studio 2005 had already given us FxCop integration, providing much needed static code analysis for .NET assemblies. Together these tools offer a peek into the deep depths of the project you’re working on, benchmarking correctness, performance, security, localisation and design issues amongst other metrics. Damn useful if you’ve inherited – as I often do in my consultancy life – someone else’s code base with little or no documentation. They say code is the best documentation, right? (oh gosh, not one of those projects!)

Whilst both the metrics in VS.NET and FxCop give you a low level understanding of your code – based on framework guidelines – what if you want a more in depth understanding of what you’re working with, rather than how you use the .NET framework? How many methods derive from a certain control (remember, we’re doing static analysis here, no Resharper loving!)? How maintainable are the assemblies? Which methods have more than 30 lines of code? (hint: time to refactor!)

This is where NDepend comes in. NDepend is a static code analysis tool on steroids – and I’m not exaggerating here. You will love NDepend long time as I do right now.

Load up NDepend, point to your assemblies, then let NDepend think a little and it will spit out a plethora of information for you to take in.

The NDepend UI – VisualNDepend

There is however one caveat: the first time you use it – and I know this will happen to the majority of users – you’ll probably be overwhelmed by what you’re presented with:

NDepend Paint.NET Analysis

So you can reproduce this with the trial of NDepend, I’m looking at the latest Paint.NET release. Once you get over your initial sense of wonder and disbelief you can start to demystify the UI and appreciate the beauty of NDepend. Don’t worry, there’s plenty of documentation and help to get you on your way – I’ll cover those later 🙂

First, we have a Class Browser to the left that lists all the assemblies being analysed – this includes the assemblies you selected (black!) and the assemblies that were added automagically by NDepend as dependencies (blue).

To the right of the class browser is our Metrics visual representation (those black balls actually mean something – Marty!). We can tell NDepend to show us (visually via the Code Metrics display) the top 10, 20, 50 and so on up to 5000 methods. Double click on any item in the view and it will automatically jump to the source (in the running VS.NET instance if available) for you to inspect further. There’s also deep integration with Visual Studio – more on that later!

Underneath the Metrics window we have the Dependency Graph on the left and the Dependency Matrix on the right. This view gives us an idea of the coupling between the assemblies in our list.

The World of CQL

Then we have – what makes me get jiggy wif it – the CQL Query window. CQL is Code Query Language, and it’s just as you’re thinking: SQL for code. Armed with a basic understanding of CQL you can get some really useful information about your project – in fact the report NDepend generates already contains a bunch of constraints for you, and the language ships with over 85 metrics in a heavily documented specification, complete with examples. Writing a simple bit of CQL like the one below will give you all the public methods that contain more than 30 lines of code.

SELECT METHODS WHERE NbLinesOfCode > 30 AND IsPublic

Neat huh? That’s only an example from the features page – there’s lots more. We can even set up a constraint to notify us when we exceed a threshold.

WARN IF Count > 0 IN SELECT METHODS WHERE NbILInstructions > 200 ORDER BY NbILInstructions DESC

This will warn us when we have methods that exceed 200 IL instructions. You can even combine a bunch of them and work out a constraint to flag which methods you need to refactor. Here’s one from the report that gets auto-generated by the VisualNDepend tool:

WARN IF Count > 0 IN SELECT TOP 10 METHODS /*OUT OF "YourGeneratedCode" */ WHERE 

                                           // Metrics' definitions
     (  NbLinesOfCode > 30 OR              // http://www.ndepend.com/Metrics.aspx#NbLinesOfCode
        NbILInstructions > 200 OR          // http://www.ndepend.com/Metrics.aspx#NbILInstructions
        CyclomaticComplexity > 20 OR       // http://www.ndepend.com/Metrics.aspx#CC
        ILCyclomaticComplexity > 50 OR     // http://www.ndepend.com/Metrics.aspx#ILCC
        ILNestingDepth > 4 OR              // http://www.ndepend.com/Metrics.aspx#ILNestingDepth
        NbParameters > 5 OR                // http://www.ndepend.com/Metrics.aspx#NbParameters
        NbVariables > 8 OR                 // http://www.ndepend.com/Metrics.aspx#NbVariables
        NbOverloads > 6 )                  // http://www.ndepend.com/Metrics.aspx#NbOverloads
     AND 

     // Here are some ways to avoid taking account of generated methods.
     !( NameIs "InitializeComponent()" OR
        // NDepend.CQL.GeneratedAttribute is defined in the redistributable assembly $NDependInstallDir$\Lib\NDepend.CQL.dll
        // You can define your own attribute to mark "Generated".
        HasAttribute "OPTIONAL:NDepend.CQL.GeneratedAttribute")

What’s more, because NDepend is language neutral you can query any managed assembly. There’s so much goodness you can get from CQL, and most of your needs are already covered in the specification.

Healthy coder == healthy code right?

NDepend also gives us a representation of what state the code is in with the generated report.

Paint.NET Abstractness vs Stability

This metric is based on Robert C. Martin’s Abstractness vs Stability paper. To quote the paper’s abstract directly:

This paper describes a set of metrics that can be used to measure the quality of an object-oriented design in terms of the interdependence between the subsystems of that design. Designs which are highly interdependent tend to be rigid, unreusable and hard to maintain. Yet interdependence is necessary if the subsystems of the design are to collaborate. Thus, some forms of dependency must be desirable, and other forms must be undesirable. This paper proposes a design pattern in which all the dependencies are of the desirable form. Finally, this paper describes a set of metrics that measure the conformance of a design to the desirable pattern.

In the case of Paint.NET we can see that we’re all over the bottom corner of the image. What does this mean?

First we have the two axes of the graph.

  • Y – Abstractness
    This measures how abstract the assembly is: can it be extended without recompiling it? Lots of interfaces and abstract base classes help here.
  • X – Instability
    This measures how much the assembly is depended upon through its public interface – the more dependants it has, the lower (more stable) it sits on this axis. Most third party components (from vendors) fall towards the low instability end, so any changes to them have to be properly managed to avoid breaking clients.

Then we have two zones.

  • Zone of Uselessness
    If an assembly is very abstract and extensible but nobody actually uses it, it will drift towards this corner.
  • Zone of Pain
    If an assembly is heavily referenced (lots of dependants) yet not very extensible – no abstractions to build on – it ends up here.

One thing to note though: the words ‘Pain’ and ‘Uselessness’ may be a bit harsh. If you – like me – have a core ‘framework’ that you write, lock down and reference from multiple projects, then it should indeed fall into the ‘Zone of Pain’, assuming that you have ensured its stability and realise the consequences of breaking it later on. Most third party products will fall in here too – we’re talking your UI controls, Sharp components etc.

Ideally you’d want your core product hovering in the green area, cosy with the line in the middle.
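For the curious, the numbers behind that graph are simple enough to work out by hand. In Martin’s paper, Instability is I = Ce / (Ca + Ce), Abstractness is A = abstract types / total types, and the distance from the ‘main sequence’ (the diagonal line) is D = |A + I − 1|. Here’s a rough C# sketch of my reading of those formulas – the figures in Main are entirely made up, NDepend derives the real ones from the dependency graph for you:

using System;

// Back-of-the-envelope versions of Martin's metrics - purely illustrative,
// NDepend computes all of this for you from the real dependency graph.
static class MainSequence
{
    // Instability I = Ce / (Ca + Ce), where Ca is afferent coupling (types
    // outside the assembly that depend on it) and Ce is efferent coupling
    // (types inside the assembly that depend on the outside world).
    public static double Instability(int ca, int ce)
    {
        return (double)ce / (ca + ce);
    }

    // Abstractness A = abstract types (interfaces + abstract classes) / total types.
    public static double Abstractness(int abstractTypes, int totalTypes)
    {
        return (double)abstractTypes / totalTypes;
    }

    // Distance from the main sequence A + I = 1. Zero is the sweet spot;
    // (A=0, I=0) is the Zone of Pain, (A=1, I=1) the Zone of Uselessness.
    public static double DistanceFromMainSequence(double a, double i)
    {
        return Math.Abs(a + i - 1);
    }

    static void Main()
    {
        // Hypothetical assembly: 40 types, 10 of them abstract, 20 external
        // types depend on it and it depends on 5 external types.
        double i = Instability(20, 5);              // 0.20
        double a = Abstractness(10, 40);            // 0.25
        double d = DistanceFromMainSequence(a, i);  // 0.55 - drifting towards the Zone of Pain

        Console.WriteLine("I = {0}, A = {1}, D = {2}", i, a, d);
    }
}

A D close to 0 means you’re hugging the line in the middle; a D close to 1 means you’re deep in one of the two corners.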

Would you like Documentation with that?

As mentioned earlier, NDepend comes with lots of help: first up – and what I used – the Getting Started screencasts, tutorials and CQL documentation with *actual* usable examples.

Scott Hanselman has also released a nice cheatsheet for NDepend that will go well hanging next to your PC.

Integrating NDepend with your Continuous Integration server

At home (and at work) we use JetBrains TeamCity; you can easily integrate NDepend into TeamCity by following Laurent Kempé’s directions.

If you use CruiseControl.NET, you’ll find Robin Curry’s guide on integrating NDepend with NAnt and CruiseControl.NET useful.

Integration with your favourite tools

NDepend fully integrates with Visual Studio and Reflector.

NDepend Options Integration

The integration in Reflector – shown below – mirrors that of the Visual Studio integration.

NDepend Reflector Integration

It gives you one click access to some common metrics.

Conclusion

If you want a good understanding of your project – or someone else’s – metrics will go a long way towards giving you an impression of its health, and NDepend will come in quite handy. We only _barely_ scratched the surface with this blog post; I’ve spent a good chunk of a week using NDepend and find it uber useful in my work life – partly because that work involves repairing the mess others have left behind – but it also serves as a good reminder of how you should write code.


Fine Print: Full Disclosure

I was offered a license for NDepend by Patrick Smacchia and given the chance to write my thoughts on this product. I was not paid to review it – feel free to send some moola if you want to though 😉


Deep Dive: How .NET Regular Expressions really work.

March 17th, 2009

Have you ever wondered how Regular Expressions really work? Most of us (myself included) just take the implementation for granted, but Jeff Moser of Moserware has posted a most excellent, very in-depth overview of how Regular Expressions have been implemented in .NET.

A must read for anyone who would like a deeper knowledge of what really happens under the hood; Jeff has done a brilliant job of pulling it all into one coherent article.
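If you want a small taste before diving in: .NET’s engine is a backtracking one, and you can ask it to bake a pattern down to IL with RegexOptions.Compiled – roughly the sort of machinery Jeff pulls apart. A trivial, made-up example:

using System;
using System.Text.RegularExpressions;

class RegexTaster
{
    static void Main()
    {
        // RegexOptions.Compiled emits the matcher as IL rather than
        // interpreting the pattern's internal node tree on every match -
        // one of the implementation details the article walks through.
        var version = new Regex(@"^(?<major>\d+)\.(?<minor>\d+)$",
                                RegexOptions.Compiled);

        Match m = version.Match("3.5");
        if (m.Success)
        {
            Console.WriteLine("major = " + m.Groups["major"].Value);
            Console.WriteLine("minor = " + m.Groups["minor"].Value);
        }
    }
}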

Whilst on the subject of regexes, I use Expresso for my regex testing.


Microsoft Releases Singularity 2.0 Research Development Kit (RDK)

November 18th, 2008

Microsoft has just unleashed the initial release of the Singularity 2.0 Research Development Kit (RDK). Singularity is a research operating system started around 2003 by Microsoft Research with the goal of writing an OS in managed code. Here are the inner workings of Singularity, taken from Wikipedia:

The lowest-level x86 interrupt dispatch code is written in assembly language and C. Once this code has done its job, it invokes the kernel, whose runtime and garbage collector are written in Sing# (an extension of C#) and runs in unsafe mode. The hardware abstraction layer is written in C++ and runs in safe mode. There is also some C code to handle debugging. The computer’s BIOS is invoked during the 16-bit real mode bootstrap stage; once in 32-bit mode, Singularity never invokes the BIOS again, but invokes device drivers written in Sing#, an extended version of Spec#, itself an extension of C#. During installation, Common Intermediate Language (CIL) opcodes are compiled into x86 opcodes using the Bartok compiler.

This new release brings some funky changes:

  • Support for AMD64 64-bit platforms
  • Updates to the Bartok MSIL-to-native compiler and the Sing# compiler
  • A new, more modern and extensible bootloader
  • Several new applications and application documentation
  • Eventing support
  • More extensive ACPI support
  • A unit testing library
  • A ramdisk device
  • An SMB client service
  • The ability to check out the most recent version of the Singularity RDK directly from CodePlex source control

It’s released under Microsoft’s shared source academic license, which basically means: do what you like, just don’t make any money out of our hard work.

For convenience there’s even an ISO already baked ready to slap into a Virtual Machine 🙂

There are other projects that tackle the idea of a managed operating system slightly differently from Singularity; I wrote about them a while ago.


Mono 2.0 Released today!

October 6th, 2008

Mono has made it to version 2.0 today and brings so much goodness to the table: some very cool new features and functionality, and promises of speed improvements – which I don’t doubt, having tried a few things.

From the release notes:

Microsoft Compatible APIs

  • ADO.NET 2.0 API for accessing databases.
  • ASP.NET 2.0 API for developing Web-based applications.
  • Windows.Forms 2.0 API to create desktop applications.
  • System.XML 2.0: An API to manipulate XML documents.
  • System.Core: Provides support for the Language Integrated Query (LINQ).
  • System.Xml.Linq: Provides a LINQ provider for XML.
  • System.Drawing 2.0 API: A portable graphics rendering API.

Mono APIs

  • Gtk# 2.12: A binding to the Gtk+ 2.12 and GNOME libraries for creating desktop applications on Linux, Windows and MacOS X.
  • Mono.Cecil: A library to manipulate ECMA CLI files (the native format used for executables and libraries).
  • Mono.Cairo: A binding to the Cairo Graphics library to produce 2D graphics and render them into a variety of forms (images, windows, postscript and PDF).
  • Mono’s SQLite support: a library to create and consume databases created with SQLite.
  • Mono.Posix: a library to access Linux and Unix specific functionality from your managed application. With both a low-level interface as well as higher level interfaces.

Third Party APIs bundled with Mono

  • Extensive support for databases: PostgreSQL, DB2, Oracle, Sybase, SQL Server, SQLite and Firebird.
  • C5 Generics Library: we are bundling the C5 generics collection class library as part of Mono.

Compilers

These compilers are part of the Mono 2.0 release:

  • C# 3.0 compiler implementation, with full support for LINQ.
  • Visual Basic 8 compiler.
  • IL assembler and disassembler and the development toolchain required to create libraries and applications.

Tools

Mono includes profiling tools and the standard development kit tools that are part of the .NET framework:

  • Debugger: this is the first release when we support a debugger for managed code.
  • Gendarme: an extensible rule-based tool to find problems in .NET applications and libraries. Gendarme inspects programs and libraries that contain code in ECMA CIL format (Mono and .NET) and looks for common problems with the code – problems that compilers do not typically check or have not historically checked.
  • Mono Linker: a linker that allows developers to reduce the size of their executables and libraries by removing features from libraries using an XML definition of the desired public API.
  • Mono Tuner: a tool to apply arbitrary user-defined transformations to assemblies. Mono uses this library to produce the Silverlight core libraries from the main system libraries.
  • Mono Documentation Tools: the Mono Documentation framework has been upgraded to support documenting generics and extension methods. The tools can be used to produce online and offline documentation for any APIs, and are used by the project to document our own APIs.

There are so many goodies in this release, if the C# 3.0 compiler with its LINQ loving doesn’t entice you already – not least the fact that Mono now provides a complete WinForms 2.0 implementation for OS X and Linux.
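To put that in perspective, a snippet like the one below – implicitly typed locals, query syntax, anonymous types – should now compile and run unchanged under Mono’s C# 3.0 compiler on Linux or the Mac (the list of names is made up purely for illustration):

using System;
using System.Linq;

class LinqOnMono
{
    static void Main()
    {
        var assemblies = new[] { "mscorlib", "System.Core", "System.Xml.Linq", "Mono.Cecil" };

        // Plain LINQ to Objects - query syntax, lambdas and an anonymous type.
        var shortOnes = from name in assemblies
                        where name.Length < 12
                        orderby name
                        select new { Name = name, Length = name.Length };

        foreach (var a in shortOnes)
            Console.WriteLine(a.Name + " (" + a.Length + ")");
    }
}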

What’s cooler is the WebBrowser control powered by Gecko that ships with Mono – this would be an ideal drop-in replacement for the MSHTML control.

There are also implementations of the Table Layout and Flow Layout panels, plus Big Arrays.

Go ahead, download a copy and give it a whirl. My, how far Mono has come over the years.


Managed Operating Systems & COSMOS – C# Open Source Managed Operating System

September 21st, 2008

Writing an operating system in managed code is not entirely a new concept, but it’s quite an interesting one. The fact that we have AOT compilers gives us the ability to write such things. This post is a little guided tour – or information dump – on COSMOS as I worked through the initial bits this weekend.

Background Information

Unlike a JIT compiler – where the initial source (say C# or Java) gets translated into an IL (MSIL in .NET, bytecode in Java) which is then compiled to native code at run time (via the .NET CLR or the Java VM) – an AOT compiler transforms the code directly to native code ahead of time, which implies it’s compiled for a specific architecture and feature set (e.g. an x86 binary). Currently there are a couple (in .NET land) to choose from: SharpOS AOT and the IL2CPU project written by the Cosmos guys.

This ensures that the OS can be written entirely in managed code, unlike other attempts such as JNode and JavaOS (both of which are Java based and include some ASM and C routines for the initial boot) and the Microsoft Singularity project – which uses some assembler and C (for the interrupt dispatcher) plus C++ code to get things moving.

Pweety Screenshots

Side by side pretty pictures of some Managed OSs:

COSMOS

This weekend I took a bit of a look-see at COSMOS, which differs greatly from Singularity. The COSMOS compiler – called IL2CPU, written in C# – converts all the generated IL code to assembler (not to be confused with a .NET assembly!); the assembler files are then processed by NASM, which generates native x86 code. Eventually the COSMOS guys hope to generate native code directly, without the need for NASM. The process is quite streamlined, and if you download the COSMOS User Kit you can get COSMOS booted up and running in minutes! It’s way coool!

The User Kit page has all the goss on getting it set up. I tried out Milestone 2, but here are some helpful hints…

  • Don’t install to the default Program Files folder (especially on Vista!); put it into a non-Windows oriented folder.
  • After installing and integrating into VS.NET, use the QEMU option to try it out – the VMWare resources aren’t distributed it seems, and as QEMU already ships with the kit there’s nothing more to do.

After you have it installed, load VS.NET and create a new ‘COSMOS Boot’ project. The default template is shown below:

using System;
using Cosmos.Build.Windows;

namespace CosmosHelloWorld
{
    class Program
    {
        #region Cosmos Builder logic
        // Most users wont touch this. This will call the Cosmos Build tool
        [STAThread]
        static void Main(string[] args)
        {
            var xBuilder = new Builder();
            xBuilder.Build();
        }
        #endregion

        // Main entry point of the kernel
        public static void Init()
        {
            Cosmos.Kernel.Boot.Default();
            Console.WriteLine("Welcome! You just booted C# code. Please edit Program.cs to fit your needs.");
            while (true)
                ;
        }
    }
}

Essentially, this boots the COSMOS kernel and displays “Welcome! You just booted C# code. Please edit Program.cs to fit your needs.” – quite simple. Run it and you’ll get the COSMOS Build Options window to help you deploy it – for simplicity select QEMU, hit Build and watch the magic of IL2CPU and the other tools come together to build your OS and launch QEMU. The output should be something like this (with differing paths of course!):

BuildPath = ‘D:\R&D\Cosmos User Kit\’
ToolsPath = ‘D:\R&D\Cosmos User Kit\Tools\’
ISOPath = ‘D:\R&D\Cosmos User Kit\ISO\’
PXEPath = ‘D:\R&D\Cosmos User Kit\PXE\’
AsmPath = ‘D:\R&D\Cosmos User Kit\Tools\asm\’
VMWarePath = ‘D:\R&D\Cosmos User Kit\VMWare\’
VPCPath = ‘D:\R&D\Cosmos User Kit\VPC\’
Now compiling
Initializing IL2CPU… This may take a minute so please wait for further status…

Recognized Plug methods:

System_Boolean__System_Array_TrySZBinarySearch_System_Array__System_Int32__System_Int32__System_Object__System_Int32__

IL2CPU Run took 00:00:05.3281467
Please wait…executing D:\R&D\Cosmos User Kit\Tools\nasm\nasm.exe…
Please wait…executing D:\R&D\Cosmos User Kit\Tools\cygwin\ld.exe…
Now creating ISO
Try removing ‘D:\R&D\Cosmos User Kit\cosmos.iso’
Try removing ‘D:\R&D\Cosmos User Kit\ISO\output.bin’
Try copying ‘D:\R&D\Cosmos User Kit\output.bin’ to ‘D:\R&D\Cosmos User Kit\ISO\’
Running mkisofs
Please wait…executing D:\R&D\Cosmos User Kit\Tools\mkisofs.exe…
Please wait…executing D:\R&D\Cosmos User Kit\Tools\qemu\qemu.exe…
Press enter to continue.

The build runs IL2CPU which outputs the ASM, which then goes through NASM, which in turn hands the result over to the GNU linker. Then we bake an ISO which gets booted by QEMU. Couldn’t be easier 🙂

A man can dream, oh yes, a man can dream

There are some incredibly exciting ideas floating around about how to make the most of COSMOS and what could be fully realised – see the Scenarios page, an interview at Obsethryl Labs on COSMOS, and another on SharpOS, all of which make interesting reading.

Next time I’ll start poking around some more and see where it gets me.


.NET Framework 3.5 SP1 CLR Improvements

August 20th, 2008

Kevin Frie, the lead developer for core bits of the CLR, just posted some information about the changes to the CLR in .NET 3.5 SP1. Here’s an excerpt:

NGen infrastructure rewrite: the new infrastructure uses less memory, produces less fragmented NGen images with much better locality, and does so in dramatically less time.  What this means to you:  Installing or servicing an NGen image is much faster, and cold startup time of your NGen’ed code is better.

Framework Startup Performance Improvements: The framework is now better optimized for startup.  We’ve tweaked the framework to consider more scenarios for startup, and now layout both code & data in the framework’s NGen images more optimally.  What this means to you:  Even your JIT code starts faster!

Better OS citizenship: We’ve modified NGen to produce images that are ASLR capable, in an effort to decrease potential security attack surface area.  We’ve also started generating stacks that are always walkable using EBP-chaining for x86.  What this means to you:  Stack traces are more consistent, and NGen images aren’t as easily used to attack the system.

Better 32-bit code quality: The x86 JIT has dramatically improved inlining heuristics that result in generally better code quality, and, in particular, much lower “cost of abstraction”. If you want to author a data type that only manipulates a single integer, you can wrap the thing in a struct, and expect similar performance to code that explicitly uses an integer. There have also been some improvements to the ‘assertion propagation’ portion of the JIT, which means better null/range check elimination, as well as better constant propagation, and slightly better ‘smarts’ in the JIT optimizer, overall. What this means to you: Your managed code should run slightly faster (and sometimes dramatically faster!). Note to 64 bit junkies: We’re working on getting x64 there, too. The work just wasn’t quite there in time.

What’s interesting to note is that these inlining optimisations will finally be coming to the 64-bit CLR as well; I just hope they arrive sooner rather than later.
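The struct-wrapping point is the one that interests me most. A made-up single-field value type like the one below used to pay a noticeable abstraction penalty on x86; with the improved inlining heuristics the JIT should now treat it much like a bare int:

using System;

// A hypothetical single-field value type - the kind of abstraction the
// improved x86 inliner is now supposed to make (nearly) free.
struct Metres
{
    private readonly int value;

    public Metres(int value) { this.value = value; }

    public int Value { get { return value; } }

    public static Metres operator +(Metres a, Metres b)
    {
        return new Metres(a.value + b.value);
    }
}

class Demo
{
    static void Main()
    {
        var total = new Metres(0);
        for (int i = 0; i < 1000; i++)
            total = total + new Metres(i);

        Console.WriteLine(total.Value);   // 499500, hopefully at int-like speed
    }
}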

In the meantime, grab the .NET Framework 3.5 SP1.
