Month: November 2006

How would I spend my $100?

Joe Duffy is busy at work writing a book about concurrency in .NET. He asks how we would spend our $100, given a choice from a set of topics available. Well, I’d have to spend mine like this:

$5 – (0) Parallel algorithms (Comp Sci., technology agnostic)
$5 – (1) Architecting large scale concurrent programs
$5 – (2) Windows concurrency internals
$5 – (3) CLR concurrency internals
$5 – (4) Windows (Win32) concurrency API best practices
$5 – (5) CLR concurrency API best practices
$5 – (6) Client-side concurrency
$5 – (7) Server-side concurrency
$5 – (8) Reusable concurrency primitives (building and using them)
$5 – (9) Performance and scalability
$50 – (A) Debugging

And here’s why. Because debugging concurrent systems is Ruddy Hard!!!!! I wish I had $200 :-)

2c/

Automated parallelization of Sequential Systems

I recently came across the Dryad project at MS Research. It is concerned with developing automated systems for deploying sequential programs across parallel platforms. This is an interesting and topical issue because the advent of multi-core processors poses problems for developers. C#, at the very least, has no inherent parallelization in the language: developers have to explicitly code a system with threading or grid computing in mind. Coding that way is generally pretty hard, and fraught with peril for the unwary programmer, so any system that can ease the work required is likely to find its way into the .NET platform in the near future.
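To make that concrete, here is a minimal sketch (plain C# 2.0, my own illustration rather than anything from Dryad) of what coding explicitly "with threading in mind" involves: the developer has to partition the work, start the thread and join it again by hand, and nothing in the language checks that the partitioning is sound.

    using System;
    using System.Threading;

    class ManualParallelSum
    {
        static void Main()
        {
            int[] data = new int[1000000];
            for (int i = 0; i < data.Length; i++) data[i] = 1;

            long firstHalf = 0, secondHalf = 0;
            int mid = data.Length / 2;

            // The partitioning strategy, the thread's lifetime and the join
            // point are all the programmer's responsibility.
            Thread worker = new Thread(delegate()
            {
                for (int i = 0; i < mid; i++) firstHalf += data[i];
            });
            worker.Start();

            for (int i = mid; i < data.Length; i++) secondHalf += data[i];

            worker.Join();
            Console.WriteLine(firstHalf + secondHalf); // prints 1000000
        }
    }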

My impression from Dryad is that, to make use of multi-core processors, we may have to change the way we design systems to allow automated parallelization. The Dryad project talks about a multi-levelled description of a system that incorporates declarations of the system’s dataflow as well as the primitive operations to be performed on it. It seems to be targeted more at the massively computationally intensive operations traditionally performed on giant vector processors. What does it mean for those of us who develop distributed applications on web farms? I’m not sure, but I think this trend will be the hot topic of the next few years.

At Tech Ed Sydney 2006, this issue was highlighted by Joel Pobar – he pointed out that the declarative elements of technologies like LINQ and F# provide a window of opportunity for the runtime to incorporate structures that will make life easier for systems like Dryad. An interesting related trend is the gradual productisation of automated analysis and proof systems. The coolest thing I’ve seen lately is the Spec# project, again from MS Research, which augments the C# IDE with various kinds of static analysis and automated theorem provers. For a while I was a researcher on the RAPIER project at the Parallel Application Centre at the University of Southampton, UK, which sought to integrate design, development, simulation and theorem-proving tools for the development of real-time systems. I can’t wait until such tools become the conventional static error checkers of IDEs like Visual Studio. It is discouraging that they are still not in the mainstream after 10 years of progress, during which time software systems have become exponentially more complex.
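Joel’s point about declarative code is easiest to see with a tiny example. The query below (C# 3.0 / LINQ preview syntax, purely illustrative and nothing to do with Dryad itself) states what result is wanted rather than how to compute it, which is exactly the latitude a runtime needs to choose its own evaluation strategy:

    using System;
    using System.Linq;

    class DeclarativeQuery
    {
        static void Main()
        {
            int[] numbers = { 5, 8, 13, 21, 34, 55 };

            // A description of the result, with no loops or threads:
            // the runtime is free to decide how (and where) to evaluate it.
            var evensSquared = from n in numbers
                               where n % 2 == 0
                               select n * n;

            foreach (var n in evensSquared)
                Console.WriteLine(n); // 64, then 1156
        }
    }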

No indestructible Lucite Shipit Awards any More?

I have to admit that my impression of what life must be like inside of Microsoft stems mostly from reading Microserfs by Douglas Coupland. As a result, I was disappointed to discover that when stuff gets shipped these days, they give each other cakes instead of indestructible Lucite plaques.

Here’s the proof:

My guess is they probably didn’t tie it to the back of a car and drive around the block either.

Boo.com to reincorporate: one for you Aggy

It appears that Boo has reincarnated to drag us down, once again. Of course, it won’t be possible without the help of Aggy Finn, who was with them at the time*.

This is how he escaped last time:

If the contractor wages go down to 10GBP/hr again, Aggy, I shall be blaming you!!!!!

* Needless to say, most of this is a pack of lies. Especially the bit about Aggy being responsible.

Tax on British Stupidity gets Hiked

According to this BBC report, the UK economy is losing billions to those Nigerian scams. Even though I don’t need to explain which scams to those of you who read this – they are proverbial – there are unspeakably avaricious and gullible cretins out there who still send their money into the Nigerian black economy.

I suppose if these people are moronic enough to think that the son/daughter of the Nigerian Minister for transport/embezzlement/finance would single them out from the billions of other candidates to share their booty with, then they deserve what they get.

Besides, it’s more sociable than playing on slot machines, or having your mobile stolen – you get to exchange emails with these nice but desperate scions of once great African families.

Excellent Article on the Concepts behind C# 3.0

Tomas Petricek has written a very interesting article on the new concepts behind C# 3.0 (here). It shows how many of the functional programming features found in C# 3.0 originate in Cω and F#. Having explored a little of the code that backs up the functional programming aspects, I understand that although the extensions build on the basic features of C# 2.0, there is a huge amount of C# code required to deliver the functional paradigm to C#. Most of that code provides complex code generation, type inference and declarative programming support.
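To give a flavour of the features in question, here is a small sketch (C# 3.0 preview syntax, my own example rather than one from the article) of functions as values, local type inference and higher-order methods over sequences:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class FunctionalFlavour
    {
        static void Main()
        {
            // A function is just a value that can be stored and passed around.
            Func<int, int> twice = x => x * 2;

            // 'var' asks the compiler to infer the type on the right-hand side.
            var numbers = new List<int> { 1, 2, 3, 4, 5 };

            // Higher-order methods take other functions as arguments.
            var doubledOdds = numbers.Where(n => n % 2 == 1).Select(twice);

            foreach (var n in doubledOdds)
                Console.WriteLine(n); // 2, 6, 10
        }
    }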

In the first section on first-class function support, I found on closer inspection (within LINQ at least) that these first-class functions are actually delivered through the DynamicMethod class in System.Reflection.Emit. If you disassemble the code, you’ll see that the relationship between imperative and functional programming in C# is through ‘runtime support’. The functional programming extensions are a runtime extension to the CLR that generates code to fulfill declarative requirements. That is, there’s no radical new paradigm at work in the core code, but the way it’s exposed will simplify things so much that it might as well be called a new paradigm.
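That runtime-support story can be seen with a small experiment (my own sketch against the LINQ preview bits, not code from the article): a lambda captured as an expression tree only becomes a real delegate when Compile() is called, and that is the step which goes through lightweight code generation (DynamicMethod):

    using System;
    using System.Linq.Expressions;

    class RuntimeSupport
    {
        static void Main()
        {
            // The lambda is captured as data (an expression tree), not as compiled IL.
            Expression<Func<int, int>> square = x => x * x;

            // Compile() generates the delegate at runtime; this is the step that
            // uses lightweight code generation in System.Reflection.Emit.
            Func<int, int> compiled = square.Compile();
            Console.WriteLine(compiled(7)); // 49

            // An ordinary lambda, by contrast, is compiled ahead of time by csc.
            Func<int, int> direct = x => x * x;
            Console.WriteLine(direct(7)); // 49
        }
    }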

Well worth a read.