URL vs URI vs URN

What is the difference between a URL and a URI? What’s a URN? Whilst I’d heard the terms, and assumed there must be a difference, I’d not much cared what that difference was. URL was good enough for me and I tended to use it exclusively.

A couple of days ago, I attended a briefing at Microsoft in Reading (England, for the non-British amongst you) that included a presentation on .NET support of REST by Mike Taulty. Mike is very good incidentally, even if he does deliver hundreds of little nuggets of useful knowledge at supersonic speeds, leaving your brain slightly jellified after an hour’s presentation. At this briefing, he commented upon not knowing the difference between URL and URI and apologised if he used the terms wrongly. This set me thinking: should I actually care what the terms mean and try and use them correctly myself?

So today I got off my backside (metaphorically speaking) and read up on the matter. My conclusions are that I wish I hadn’t bothered. The subject is a minefield of utter techno-babble.

Continue reading “URL vs URI vs URN”

Launching Products, The Microsoft Way

Later this month, Microsoft will be launching three “new” products at the first of their “HEROES happen {here}” events:

  • Visual Studio 2008
  • SQL Server 2008
  • Windows Server 2008

Now to my mind, “launch” and “release” ought to be synonymous terms here. There is a problem though: VS2008 was released last year and SQL Server will now be released later this year. So what exactly is this launch supposed to be all about if only Windows Server is actually being released at that time?

Francois Ajenstat, Microsoft SQL Server product director, helpfully explained what was going on with SQL Server. Apparently it won’t be late. Instead:

“…To continue in this spirit of open communication, we want to provide clarification on the roadmap for SQL Server 2008. Over the coming months, customers and partners can look forward to significant product milestones for SQL Server. Microsoft is excited to deliver a feature complete CTP during the Heroes Happen Here launch wave and a release candidate (RC) in Q2 calendar year 2008, with final Release to manufacturing (RTM) of SQL Server 2008 expected in Q3. Our goal is to deliver the highest quality product possible and we simply want to use the time to meet the high bar that you, our customers, expect.
This does not in any way change our plans for the February 27 launch…”

So apparently only having an alpha test version available, rather than the finished product, in no way changes their plans? Not sure I follow that one. If you are left somewhat confused as to what Francois is wittering on about in the above piece, you are not alone. Phil Factor has helpfully written an article that explains such marketing-speak.

TDD is Proven to Improve Quality and Productivity. Allegedly.

Phil Haack recently wrote about a paper published in IEEE Transactions on Software Engineering entitled “On the Effectiveness of the Test-First Approach to Programming”. The paper reported the results of an experiment into the relative merits of the test-then-code approach over the code-then-test approach and proudly concluded:

We found that test-first students on average wrote more tests and, in turn, students who wrote more tests tended to be more productive. We also observed that the minimum quality increased linearly with the number of programmer tests, independent of the development strategy employed.

As Phil’s post explains, the results are not the end of the discussion, but they appear to provide the much requested evidence that TDD is a cost-effective and worthwhile tool that ought to be used when developing software.

However the conclusion above and the paper’s results don’t quite tie up. Jacob Proffitt writes an excellent critique of the paper and its conclusions. He points out that the results show better quality and productivity when unit tests are written after the code. Continue reading “TDD is Proven to Improve Quality and Productivity. Allegedly.”

1000 Spam Comments Blocked by Akismet

Update, October 2015
I’ve recently been reviewing older posts and came across this one. In those seven years, the number of spam comments blocked has grown to nearly 300,000. This spam blocker remains an amazing tool!


I’ve been writing this blog now for six months. During that time there have been just over 60 legitimate comments to my posts from the small audience I’ve built up. During the same time, the absolutely wonderful tool Akismet has trapped 1,000 spam comments. I used to think that email spam was a complete pain, but it is nothing compared with blog spam. And the stuff is psychologically very clever and thus even more evil than email spam. Blog spammers put a lot of effort into encouraging the blog owner to override the advice of the likes of Akismet and to mark stuff as not spam. Thus most spam starts with things like:

“You are Great. And so is your site! Awesome content. Good job guys!”

“I am so thankful for finding your website!”

“You have an outstanding good and well structured site. I enjoyed browsing through it.”

Those kind words are then followed by (often many tens of) links to sites selling fake Viagra, pirated software etc. Most overdo the links and so it’s easy to agree with Akismet’s assessment of them as spam. Others though are full of honeyed words and just a single link associated with the name (which legitimate commenters are allowed to have too). It is more difficult to resist marking these as not spam, though a quick visit to the website in question generally convinces me to resist!

So thanks to those that have posted genuine feedback, and a very big thanks to the Akismet team for saving me lots of hassle of manually removing spam comments. Also I’d like to wish all manner of uncomfortable and embarrassing diseases on the spammers. Finally I’ll make a plea to those readers that haven’t yet posted a comment to speak up more. Everyone’s feedback is always very welcome.

IE8 Promises Both Backward Compatibility and Standards Compliance in One Box

Back in the “good old days”, when Netscape had lost the browser war and everyone (bar a few Linux geeks, but no one cared about them) used IE6, the life of a web developer was easy. One simply developed a website, checked it looked OK in IE and then published it. Then Firefox appeared and started to steal back some web users from Microsoft. This led to the need for web developers to implement “standards mode” pages. What this basically meant was that they specified the document type so that Firefox knew how to render it and then worked around the oddities of IE6 (which infamously had its own “standards”, which weren’t that standard really).

All was well for a time. Then IE7 appeared on the scene. IE7 was a huge paradigm shift for Microsoft. It actually sought to be standards compliant. So it rendered “standards mode” pages according to the real standards, rather than IE “standards”. In so doing, they “broke the web”. In reality, naive, slapdash web developers broke the web by supplying IE6 work-arounds to all IE6+ versions, rather than targeting them specifically at IE6. This meant that their sites failed to work properly when an IE7-equipped user visited. Microsoft were blamed for this however, for those pages worked fine in IE6 and not in IE7, and so obviously (people are stupid, lazy and blame the “obvious” all too often) IE7 broke things.

Our little story now brings us to 2008 and IE8. Microsoft tried to be standards compliant with IE7 and got their fingers burnt. So will they abandon such lofty ideals with IE8 and leave the quirks in it? Well, yes and no is the answer. IE8 will be the most standards-compliant of all IE versions, yet it will also be fully backward compatible with IE7. What is really great, though, is that the website designer, rather than the user, gets to choose whether IE8 is compliant or backward-compatible. By default, it acts like IE7. If however you want it to comply with the standards, simply put something like the following (the X-UA-Compatible meta tag described in the articles linked below):
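    <meta http-equiv="X-UA-Compatible" content="IE=8" />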

in the head of your page. This simple addition then turns on the full standards-compliance mode when rendering your page.

Even the most anti-Microsoft people must admit that that is a pretty neat solution.

You can read (far) more about this by visiting this Bink.nu article on Standards and IE8 and by reading Beyond DOCTYPE: Web Standards, Forward Compatibility, and IE8.

UPDATE:
If the above has left you thinking “Uh?”, here is a post that explains the whole lot far more simply using toy lemurs. The author – Kate Bolin – is clearly rather bonkers! 😀

Forget Web 2.0 and Web 3.0: a real web upgrade is on the way

In recent years, we have seen “Web 2.0” technologies push HTML to its limits in terms of making the web act like a set of applications. Google docs and spreadsheets are representative of how far those limits have been pushed, but they also highlight how far short of being true applications HTML-based apps fall. This has prompted talk of “Web 3.0”, whereby HTML will take a back seat to Flash and Silverlight based Rich Internet/ Interactive Applications (RIAs). Part of the reason for this push to Web 3.0 is that it has been seven years since the last HTML standard (XHTML 1.1) was approved by the W3C. The standard that underpins the web is ridiculously old in internet terms. Four years ago, talk began over HTML 5. Four years on, today finally sees the release of the working draft of this new standard. If the W3C can pull its collective finger out and get the draft ratified in the next year, then it will be a very exciting release. If they continue at their current snail’s pace, then sadly they will likely have missed their chance and the web will be destined to become a Flash and Silverlight hell-hole.

So why is HTML 5 so exciting? The answer is simple: it fixes just about all the current shortcomings of HTML that Flash, Silverlight, PDFs and Quicktime seek to fix, without needing plugins for all of the above (and possibly not being able to get a plugin for one of the above if you are on the “wrong” browser or operating system). It also fixes a bunch of problems that none of these plugins successfully fixes by itself:

  • First there is the new <canvas/> element. This element allows on-the-fly drawing on an HTML page, with lines, gradient fills, drop shadows, transforms and the like using JavaScript (see the sketch after this list). Think Silverlight 1.0 built into the web browser with no need for a separate download.
  • Next there is the <video/> element. No more having to embed the Flash/ Silverlight/ Quicktime/ Media Player object within the page; just use the built-in tag.
  • Want to do documentation in HTML? Fed up with the lack of proper mark-up tools? Currently this drives people to use PDFs (or, if they are really clueless or just joined to Microsoft at the hip, Word Docs) instead. The inclusion of sections with headers and footers, figures, asides etc now provides a much richer suite of mark-up tools for HTML documentation.
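
To give a flavour of the first of those, here is a minimal sketch of the kind of JavaScript the draft has in mind for the <canvas/> element (the element id, colours and co-ordinates are my own invention, not from the spec):

    // assumes <canvas id="demo" width="240" height="140"></canvas> somewhere in the page
    var canvas = document.getElementById("demo");
    var ctx = canvas.getContext("2d");   // the 2D drawing context

    // a gradient-filled rectangle with a drop shadow, no plugin in sight
    var fill = ctx.createLinearGradient(0, 0, 200, 0);
    fill.addColorStop(0, "#004080");
    fill.addColorStop(1, "#80c0ff");
    ctx.shadowColor = "rgba(0, 0, 0, 0.5)";
    ctx.shadowBlur = 8;
    ctx.fillStyle = fill;
    ctx.fillRect(20, 20, 200, 100);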

The really clever new stuff though includes:

  • History and location support. The page will no longer be limited to trying to trap the page-back event; it will now be able to find out its own history in terms of the back and forward URIs and state models (state objects that the page can define).
  • Storage. Persistent local storage, including a SQL database, will be available to the page for storing state data locally on the user’s machine (see the sketch after this list).
  • Offline application support. The client will provide an application cache, allowing an HTML-based RIA to ensure the JavaScript, images etc required for it to function correctly offline are stored in local cache before the application starts. So a blog entry editor application for example might download recent posts to local storage and its application model to cache. Then when the user edits a post and saves it, it could be re-saved to local storage for later synchronising if off-line, or uploaded immediately if on-line.
  • Documents become editable. Set the document’s designMode property to “on” and the whole page becomes editable. Will the whole web become a giant wiki?
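
A couple of those are easier to picture with a tiny sketch (the key name and text below are my own invention, and the storage API is one of the parts of the draft still being argued over, so treat the exact names as illustrative):

    // persistent key/value storage that survives the browser being closed
    localStorage.setItem("draftPost", "My half-written blog entry...");
    var restored = localStorage.getItem("draftPost");

    // make the whole document editable, wiki-style
    document.designMode = "on";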

The scope of the changes from v4.01 to v5 of the HTML specification is vast and I’ve only touched on a few aspects. If you want the full story, grab yourself a huge mug of tea or coffee, set aside a few hours and read the HTML 5 Working Draft.

.NET Source Code Released to VS2008 Developers

Back in October, I wrote a blog post about Microsoft’s plans to make the source code of .NET available under a “look; don’t touch” license to VS2008 developers. At the time, some were predicting it was the end of open source, and some were predicting it was the start of Microsoft going open source with .NET. Whether either extreme will come to pass is unknown (though also highly unlikely in my view), but time may well begin to tell. As of this week, Microsoft have begun the roll-out of that source. More details (including how to access the sources) are available over on Scott Guthrie’s blog.

C# and Friend Assemblies Made Easy

If you are familiar with C++, you’ll know about “friend classes” within that language. If you are not familiar with them, they are a neat way of exposing otherwise private members to trusted classes. Class A has a bunch of private members that are obviously hidden from other classes, but there is a problem with this model. Class B is a unit test class that tests class A. Because it is doing unit tests, it needs to delve deep inside class A to check what is going on. By making class B a friend of class A, the former is then granted access to the latter’s inner workings.

So what does this have to do with C#? Well, C# partially fixed this with the “internal” keyword. Internal members of a class are visible to other classes within the same assembly, but are effectively private as far as classes outside the assembly are concerned. Internal members are very different from protected ones by the way: the latter are only visible to subclasses (whether in the same assembly or not). However, largely due to a limitation with Visual Studio, internal members used not to be the full answer to the needs of unit testing. The C# compiler supports compiling multiple projects into one assembly, but Visual Studio has a strict one-to-one project/assembly mapping. So unit test classes used to have to reside within the same project as the product code (which wasn’t practical if the project’s assembly was an executable, rather than a dll). The other alternative was to expose all the members that the unit test classes needed to access by making them public.

I talk in the past tense in the previous paragraph as this problem was fixed with the release of C# 2.0. This introduced the idea of Friend Assemblies. To explain, let’s go back to our classes A and B. Class A resides in assembly A.dll. Class B resides in assembly B.dll. Normally class B would have no access to the internals of class A. Add the assembly-level “InternalsVisibleTo” attribute to A.dll though, specifying B as the trusted assembly, and now class B can access those internals. Further, unlike with C++ and friend classes, private members in class A remain private; only internals are exposed via this method.

If both A.dll and B.dll are unsigned, the extra code is just:
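    // goes in A.dll, typically in AssemblyInfo.cs; "B" is the friend assembly's
    // simple name, i.e. without the .dll extension
    using System.Runtime.CompilerServices;

    [assembly: InternalsVisibleTo("B")]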

which is a pretty neat solution.

Things get really messy though the moment signed assemblies are used. The reason is that one must put the public key (not the nice short token, the full-blown humongous key) in the attribute. So it becomes something like:
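    // the hex after PublicKey= is a truncated placeholder here: the real thing is the
    // full public key of B.dll, several hundred hex characters long
    [assembly: InternalsVisibleTo("B, PublicKey=0024000004800000940000000602000000240000
        525341310004000001000100...
        ...and so on for the rest of the key...")]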

(I’ve wrapped the public key across many lines for clarity. In reality, it must be on one line)

The whole horrible nastiness of the matter doesn’t stop there though. To get that public key, one must find the sn.exe tool, plough through its dozens of obscure options, and then run the right one against the required assembly (or run a sequence of two even more obscure commands against the original key file). Then one can cut and paste the output of the command into the attribute in Visual Studio. This is what we in Britain politely term a “complete faff”.
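
For reference, the incantations go something like this (B.snk is assumed to be the key pair file that B.dll was signed with; adjust the names to taste):

    rem two-step route: extract the public key from the key pair, then display it as hex
    sn -p B.snk B.publickey
    sn -tp B.publickey

    rem or run it directly against the already-signed friend assembly
    sn -Tp B.dll

The long hex dump that either route prints is what gets pasted after PublicKey= in the attribute.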

Having banged my head against the wall performing this operation recently, I decided that a simple little utility was required to do all the work. So I wrote an app called FriendAssembly, which I’m sharing here.

Continue reading “C# and Friend Assemblies Made Easy”

MacBook Air: Style Over Substance or Another Winner for Apple?

It is one of the worst-kept secrets in recent years: Apple were due to unveil a slimline Mac laptop at this year’s Macworld expo. And Steve Jobs dutifully did reveal it. It costs a small fortune; has a low-speed CPU; has no built-in DVD drive. However it is claimed to be the slimmest laptop in the history of the universe and only weighs around 3lb (less than 1.5kg).

macbook_air.png

So what does one get for $1799 (or around $2400 if you live in Britain, but it is OK, we are used to US companies ripping us off. Adobe charge double for their products here, so 33% more expensive than the USA ain’t that bad)? A 4200 RPM (ie sloooooow) 80GB hard-disk, a 1.6GHz Core 2 Duo (the substantially cheaper, but heavier, MacBook offers a 2.0GHz CPU) and a super-thin, super-light case. The Macworld site has a good summary of its specs.

Obviously the Mac fanboys are already discussing buying it. But should the ordinary person buy one? In my view, this is a step too far down the style-over-substance path by Apple and it would be a bloody stupid idea to buy one. If you want a Mac laptop, buy a MacBook Pro ($100 more for so many more features). If you want a lightweight laptop, buy an Asus Eee PC and save yourself $1300 in the USA or £1000 (around $2000) here in Britain.

Problem Solving the Microsoft Way, or How to Get Silverlight Onto Every Desktop

Problem: Silverlight is new; hardly anyone knows about it, so hardly anyone will use it (why bother when they already have Flash?)

Solution: Provide users with exclusive access to over 3000 hours of live and on-demand video content via Silverlight streaming. And not just any old content either: we are talking the Olympic Games here. Microsoft have secured a deal to provide the online coverage of the Beijing Olympics in August this year and all that coverage will use Silverlight, requiring everyone who wants to watch the stuff to install the Silverlight (version 2, apparently) plugin.

This is a clever move by Microsoft, and brilliant news for RIA developers concerned at the lack of market penetration of Silverlight. After the games, that penetration rate should have jumped from near 0% to a very significant percentage of the world’s computers.

More details are on Somasegar’s blog.

Bill Gates’ Last Day at Microsoft

Bill Gates, at his final CES 2008 keynote speech, showed the world that when you are the richest man in the world, not only can you take the mickey out of yourself, you can rope the likes of Bono, Steven Spielberg, George Clooney and Al Gore into helping you. The spoof movie of Bill’s last day at Microsoft is buried inside a glitzy and tedious hour-long keynote. Luckily a decent soul, Long Zheng over at istartedsomething, has taken the full keynote, chopped the drudge out and left just the spoof behind for all of us to enjoy. Click on the flash below to see it, or head over to istartedsomething for the full size version.