Free at last…

Already T**stra is beginning to fade into an unhappy recollection. Considering that I congratulated myself on my good judgment in getting a place at an Australian household name onto my CV, I’m feeling a little (a lot) foolish now. I have worked in environments as pathological as Telstra’s before, but very seldom, and the job-hunting process has now sensitized me to the cultures of the corporations I approach. I guess it is almost a truism that in a large corporation, anywhere in the world, the effectiveness of individuals varies inversely with the size of the company. And let’s face it, Telstra is the largest employer in the whole of Australia. Anyway, if you want to do architect-level work in Australia, you pretty much have to give up any idea of going near a compiler. I can’t see how I could be an effective architect without first-hand understanding of the technologies I’m using, and how else to get that than by using them?

So after four months of what should have been a smooth transition back into work, I have come out the other end with my sanity barely intact. Not a single thing I did there was used, and some of it was probably even worth using. I shouldn’t be surprised, since it seems that understanding of, or experience in, the software lifecycle is not a prerequisite for participating in the management of a software project (here). I have even been told that it is an actual impediment to getting a job on a project team.

This miasmic feeling of despondency is deepened by my current reading matter: “Red Rabbit” by Tom Clancy, which tries to bring to life the feelings and thoughts of people trapped in the midst of the KGB bureaucracy in the early eighties. Oddly, the environments it describes feel apropos in the extreme to my own situation. It heartens me a little to recall what happened to the Soviet regime, and at least you can opt to leave Telstra without fear of banishment to Siberia. I guess the Telstra equivalent of a gulag work-camp in the Siberian tundra would be a revenue-protection assignment in the billing systems maintenance department, a fate I adroitly sidestepped only a month ago!

Well, here goes nothing…

With my trusty Brillo pad in hand, I am preparing to revolutionize my software development experience™. Yes, the install media arrived through the post the other day for Visual Studio 2005…

… [Please Wait]

…diddle diddle dum, diddle diddle dum. And we arrive at the present moment. Yes, I sat there like a zombie for a whole evening. But the end result is what counts, and that’s the problem: now none of my code works any more. Boo hoo.

So I either have to debug the EDRA config app block or go hunting for a .NET 2.0 version of a config tool. I probably also have to do the same with all the templating tools, logging tools, and every other trace of code reuse that I have incorporated into my Dbc system.

Undoubtedly, VS.NET 2005 beta 2 is by far the slickest development environment I have ever seen. But I really can’t be bothered to go through all this pain at the moment. Perhaps I should revisit the idea of using VPC 2004 for my dev environment. The problem is that a whole other bunch of problems comes with that route. So what do I do? I wanna be able to play with the latest toys, but I wanna do it without incurring weeks of pointless twiddling after I realize my recklessness has rendered weeks of work either obsolete or too bleeding-edge to work. [sob]

Interesting Stats

The graph below comes from a recruitment site. It shows the relative importance of programming-language skills and how they have changed over the last few years.

Aside from the fact that C# isn’t even in there, which I find a little alarming, it shows that all of the languages listed are in decline, especially my beloved C++, king of all languages. Does this graph indicate that there is a proliferation of languages, and that the impact of the previously major languages is being diluted?

Are we building a Tower of Babel out of scripting languages, one that will come tumbling down around our ears? Perhaps Microsoft foresaw this, and it motivated them to target language independence for the .NET platform, rather than platform independence as was the case with Java?


Cyborg Upgrades

This from LiveScience via KurzweilAI: Monkeys Brains Alter to Work Robotic Arm

A new study finds a monkey’s brain structure adapts to treat a robotic arm as if it was a natural appendage. The finding bolsters the notion that the primate brain is highly adaptable, and it adds more knowledge to the effort to create useful prosthetic devices for…

This is a very thought-provoking piece of news. To me it seems to point to the way that awareness inhabits the sensorium, no matter what the origin of that sensorium. If the monkey brain can adapt to the use of non-organic appendages, then maybe it is possible to go further and progressively substitute pieces of the body with cyborg replacements. It brings to mind the thought experiments in which functionally identical transistorised components are used to replace neurons in the brain. The thought experiment was intended to highlight the difference between those who hold that hard AI is true and those who believe that something ineffable is lost in the process, which I guess they would call consciousness, or soul, or some similarly ill-defined term.

Others have argued that the real challenge lies in replacing the neurons with functionally equivalent components at all, because of the potential for brain cells to be small quantum computers, but in recent years even that has come to be seen as a hurdle that can be overcome. I’m encouraged by these results because they silence the objections of those who hold that the ineffable thing lost in a cyborg is its organic nature, as though carbon-based molecules were somehow privileged and able to yield consciousness in a way that other assemblages of atoms are not.

Surely it is the pattern of signals going to and from the brain that counts, not their origin?