Navigational Spaghetti -- What are your thoughts?


TimeSnapper (like a lot of software) has grown organically. From simple beginnings its feature set has expanded to contain possibilities of which we never dreamt.

But this organic growth has meant that the navigational structure within the program has failed to keep up.

Occasionally we get requests for features that are already present -- simply because existing features can be hard to locate. (I call this 'The MS Office Paradox')

Sometimes we get requests along the lines of "I once found this great feature in timesnapper, but now I can't seem to get back there." (let's call this 'The Minos Conundrum')

And sometimes we get suggestions along the lines of "How can I get from screen X to screen Y, in less than five steps, without going through any form twice?" (I call this 'The Bridges of Königsberg Puzzle')

These kinds of requests have been steadily increasing over the life of the program, and it's not going to get any simpler.

To try and understand the problem, I sat down today and drew a picture of the major forms in the application and how you can get from one to the other. The picture was too big to scan in, so I re-drew it in Visio. It ain't pretty:

navigation in timesnapper today

Then I added in lines for all the major new routes that people have been asking for:

navigation in timesnapper tomorrow

And I immediately predicted where this was headed:

navigation in timesnapper the day after tomorrow [predicted]

So unless we're willing to let TimeSnapper turn into a pastafarian deity, it's important we address this within one or two releases.

I don't know what the best solution to this is, and we're open to ideas. If you've got any -- please share.

One thought I've had is that we could include a context menu throughout the application, so that wherever you are you can right-click and get a 'goto' menu that gives you consistent choices.

That way, without cluttering up the interface, we make every path possible. The downside to this is that it lacks discoverability. Another option is to include it as a menu at the top of each form. This would take up real-estate and sometimes seem inappropriate.
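
For the WinForms-inclined, here's a rough sketch of how that shared right-click 'goto' menu could hang together. The destination forms here are just throwaway placeholder stubs -- not TimeSnapper's real screens -- so treat it as a sketch of the idea, not the implementation.

using System;
using System.Windows.Forms;

// A sketch of the shared right-click 'goto' menu idea.
public static class GotoMenu
{
    public static void AttachTo(Form host)
    {
        ContextMenuStrip menu = new ContextMenuStrip();
        AddDestination(menu, "Day Browser");
        AddDestination(menu, "Reports");
        AddDestination(menu, "Options");
        host.ContextMenuStrip = menu;   // same menu on every form: consistent navigation
    }

    private static void AddDestination(ContextMenuStrip menu, string name)
    {
        menu.Items.Add(name, null, delegate
        {
            // In the real app this would show the existing form instance;
            // a throwaway stub keeps the sketch self-contained.
            Form stub = new Form();
            stub.Text = name;
            stub.Show();
        });
    }
}

Every form calls GotoMenu.AttachTo(this) once, and right-clicking anywhere gives the same consistent set of choices.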

We haven't considered including a ribbon-bar, or sticking a giant MS Outlook 97-style bar on every form, or turning it into an MDI style application.

All up it's one of those simple yet thorny design issues that software development is filled with. So I thought I'd share it, and see what ideas were out there.

 

Step 5 of 25 to Building a Micro-ISV: Install traffic monitoring on your web site

(See also, The complete list: 25 steps for building a Micro-ISV)

Online businesses have amazing capabilities when it comes to understanding their customers. With good web analytics software we can know where every potential customer has come from, what they did while on our site, and where they ended up.

To gather this same kind of information, offline businesses are stuck doing expensive surveys and employing demographics companies. All they end up with is a vague shadow of a reflection of a grain of the truth.

But even so, offline businesses are willing to pay exorbitant sums for this kind of information. Why do they blow all that dough? Because everything in business revolves around knowing your customer.

And lucky for us, the best things in web analytics are free.

If you want, the decision of which web analytics provider to use can be an incredibly complex one. You can purchase a 275-page book from CMS Watch!

Or you can use:

"Leon's 100% guaranteed, absolutely idiot-proof one-step guide to choosing the right web analytics provider"

Here it is...

  1. CHOOSE GOOGLE

Google Analytics (wikipedia entry) is the monster in this field. It's very easy to use, a cinch to install and chock full of information.

We use it at TimeSnapper. My business partner Atli set it up, so I can't really talk about the specifics. There's plenty written about it elsewhere, including some great info about Google Analytics from LifeHacker.

But here's quick coverage of one of the basic web-analysis issues that's particular to people who distribute binaries, such as Micro-ISVs.

Q: How do you use page-tracking software to track downloads, rather than page hits?

For example, when someone downloads your software, they might be downloading a zip file or an msi file. That's not a web page... how can web analytics software keep tabs on that?

I've got two answers to this one. Firstly, you can configure Google Analytics to record clicks on the links that lead to these files, as if those downloads were pages in their own right. See the info about urchinTracker for help.

The other technique is that your web server will have tracked every request for such files -- so you can turn logging on with your web server and mine the logs. There's a lot of free software that can make this easy for you.

Some of the packages for doing this (and this is 'seriously old-skool stuff' that your web 2.0 script kiddies can't remember, as they weren't born when these things hit their heyday) include AWStats, WebTrends and Analog.
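
And if you'd rather roll your own quick-and-dirty count instead of installing one of those packages, a rough C# sketch of the idea looks like this. It assumes a space-delimited, W3C/IIS-style log; the field position and the log file name are made up, so adjust them to match your own server's format.

using System;
using System.Collections.Generic;
using System.IO;

// A quick-and-dirty sketch of mining a web server log for download counts.
class DownloadCounter
{
    static void Main()
    {
        const int uriField = 4;                       // assumption: position of the requested-path field
        Dictionary<string, int> counts = new Dictionary<string, int>();

        foreach (string line in File.ReadAllLines(@"ex070830.log"))   // made-up file name
        {
            if (line.StartsWith("#")) continue;       // skip header/comment lines
            string[] fields = line.Split(' ');
            if (fields.Length <= uriField) continue;

            string uri = fields[uriField].ToLowerInvariant();
            if (uri.EndsWith(".zip") || uri.EndsWith(".msi"))
            {
                int current;
                counts.TryGetValue(uri, out current);
                counts[uri] = current + 1;
            }
        }

        foreach (KeyValuePair<string, int> pair in counts)
            Console.WriteLine("{0}\t{1}", pair.Key, pair.Value);
    }
}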

What's the end game?

Contrary to common opinion, web analytics are not an end in themselves. It's not just about the pretty dashboards. All of that information is gathered for a reason -- feedback control.

Once you're gathering detailed feedback -- you can begin to gauge the effectiveness of all of the marketing activities that you engage in. And in software, and on the internet -- EVERYTHING is marketing. In fact, marketing is so central to everything you do online, that you don't even need to think of it as marketing. It's just 'being'.

So you re-arrange the layout of your landing page. Was it a good thing or a bad thing? That's what analytics can tell you! You changed the download image, changed your template, re-worded your slogan, put flyers in your local nudist colony's newspaper... did it have any effect? Should you do more of that or less? Analytics! That's where the answers are!

Well-Regarded Alternatives

I have no experience with these three alternatives -- but if there's a feature you need and Google Analytics doesn't offer it, then check out these three little fellas, because they're recommended by various contributors at the joelonsoftware Business of Software forum.

Increase Your Arsenal!

For the serious infonaut, there are other tools you can use as well.

We've set up Google Alerts to give us a daily email about anyone mentioning TimeSnapper or any other topics of interest to us.

Services like Technorati are also useful for finding out who's mentioning your product, or that of a competitor, or any other topic you're tracking.

Where to from here?

I co-implemented these 25 steps a year ago now (with TimeSnapper), and first started to write about them nine months ago.

There's been a long break between writing steps 4 and 5. But don't assume that I've stopped writing them, nor believe that I've promised to complete them.

If there's a particular step that you're waiting for, or if you need particular support or advice about your Micro-ISV, don't hesitate to email me (leonbambrick at gmail dot com). Other people sure do, and I love to help out where I can.

Also of course, I recommend the joel on software 'business of software' forum as a great place to ask any questions you have. The people there are always helpful, if sometimes a little grouchy ;-)

If you're looking for a book on the topic, I know of two excellent books: The Business of Software, by Eric Sink, and Micro-ISV: From Vision to Reality by Bob Walsh. I haven't read Bob's book -- but my business partner Atli sure has. I know this because I get a steady stream of suggestions from him, all of which are dynamite stuff.

Thanks for your patience -- if you see any flaws in anything I write, please speak up, and please contribute anything extra you'd like to add. I know I've left out a lot of detail here.

Best of luck.

lb.

 

Thou Shalt Have One Exit Point Per Subroutine -- A Monkey Cage Conundrum

Back when I was a wee lad in Engineerin' school, fussin' with alloc and malloc and all of that hurty stuff, a particular lesson was drilled into us:

Thou Shalt Have One Exit Point Per Subroutine

This was given as gospel and we learnt it as such. It was such a fundamental principle of the coding style they drummed into us that the reasons for it were glossed over quickly -- "it makes maintenance easier".

I stuck to that principle for years -- until finally I thought about the underlying reasons, the trade-offs involved, and realised that the benefits were redundant now. The rule had become, mostly, bunk.

Yet, every time I write a subroutine that has more than one exit point, I find that other monkeys in my monkey cage jump up and down and screech and throw excrement and chatter excitedly.

The psychological aspect of their behaviour was opaque to me until recently, when Phil Haack pointed out a particular monkey cage experiment.

So go and read about that monkey cage experiment now. I'll be here. Grinning demonically. Then we'll continue.

So why was that commandment so important once? And why is it no longer of such importance?

The big reason used to be around memory management.

You wanted to make sure that any allocated memory was cleaned up at the end of a routine, regardless of what other logical decisions were made during that routine.

Failure to clean up memory properly was terrible -- not just because it caused catastrophic failure. The real problem was that it caused symptoms that were very far removed from the cause itself. When you have a memory leak in a program, you can't simply read a stack trace that tells you where to go to debug the problem. You often end up having to crawl meticulously over the entire program.

In such cases you curse bitterly at any methods that have more than one exit point, because you can't check them quickly -- you instead have to analyse every last piece of logic.

We're Finally Using the Right Tools

In this age of managed memory, that sort of memory leak is not an issue (yes, there are still ways to leak managed memory). And better still, we have two programming constructs (in C# and others) that give the exact kind of flow control we want -- without getting in the way of the program's intent. So if you've got objects to clean up -- use 'using' or 'finally'.
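
Here's a minimal sketch of both constructs, just to make the point concrete: the early return is harmless, because the cleanup is guaranteed either way.

using System;
using System.IO;

class CleanupDemo
{
    // 'using' guarantees Dispose() runs no matter how we leave the block,
    // so an early return doesn't leak the stream.
    static string FirstLine(string path)
    {
        using (StreamReader reader = new StreamReader(path))
        {
            string line = reader.ReadLine();
            if (line == null)
                return string.Empty;   // early exit: the reader is still disposed
            return line;
        }
    }

    // 'finally' does the same job when you need custom cleanup.
    static void WithFinally(string path)
    {
        StreamReader reader = new StreamReader(path);
        try
        {
            Console.WriteLine(reader.ReadToEnd());
        }
        finally
        {
            reader.Close();   // runs even if ReadToEnd throws
        }
    }
}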

Well, the surface lesson is simple. The context of a 'Best Practice' changes over time, so a best practice shouldn't be considered in absence of its context. And every practice must be revisited every once in a while, to see if the context still applies.

The deeper lesson is that monkeys can't be separated from the tortures they've undergone in the past. Maybe it's futile to try and teach old monkeys new coding paradigms. It's certainly likely to get you pelted with simian scat.

I don't like preaching -- so I'm going to stop right there. But if you're unconvinced, read up on guard clauses, and question your own beliefs a little. It's a deep deep vortex, the inner mind of the inner code monkey.

To finish that topic, here's a comment from Phil Haack that sums up my feelings well:

Get all the failure conditions out of the way at the top of the method so the body can focus on the real meat.
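
Here's a minimal sketch of that guard-clause style (the Customer type and the discount rules are made up for illustration):

using System;

class Customer
{
    public bool IsActive;
    public bool IsPreferred;
}

class DiscountCalculator
{
    // Guard clauses: each failure case exits immediately, and the
    // "real meat" isn't buried three levels of 'else' deep.
    public decimal CalculateDiscount(Customer customer, decimal orderTotal)
    {
        if (customer == null) throw new ArgumentNullException("customer");
        if (orderTotal <= 0) return 0m;
        if (!customer.IsActive) return 0m;

        return customer.IsPreferred ? orderTotal * 0.1m : orderTotal * 0.05m;
    }
}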


Ah -- one last detour...

I found a Java thread on the same topic, wherein the discussion veered onto people doing things because 'that's the way they've always been done'.

This led to a discussion of driving on the right side of the road (the dominant paradigm on earth) and JosAH claimed this had deep roots in times long past:

Riding a horse on the right side of the road in the dark ages made it nearly impossible for right handed sword fighters to combat each other when they were passing each other. It's the 'peaceful' side.

to which George123 replied:

And having the steering wheel on the left side makes it easier to hit the kids in the back seat. Just thought I'll clear that up.

I am enlightened. I am at one.

 

Brisbane .Net Developers: Interested in MOSS integration or K2 Workflow?

A company just around the corner from where I work is looking to build up a strong team of .net developers.

The company is called Image Process Solutions and they're leaders in Record Management, Document Management, "Compliance", and the sort of workflow that these activities entail.

If you're an experienced .net developer in the Brisbane region -- get in contact with them via the phone number on their contact us page or email info@imageprocesssolutions.com.au with a resume and an action packed cover letter.

I've spent a bit of time talking to these guys about what they do. They're early adopters and deep integrators with MOSS 2007, K2 Black pearl and another big-end-of-town product called Meridio. And my microsoft contacts tell me that IPS are the best in Australia at this kind of work.

If you do go along -- then say hi to Desmond, and tell 'em Leon sent ya.

An IPS Logo

(August 30, 2007)

 

General Purpose Programming Language... Good For Nothing?

There's a famous screencast in which David Heinemeier Hansson shows how to use Ruby on Rails to build a blog engine, in under ten minutes.

BUT THIS ISN'T A BLOG POST ABOUT RUBY! THIS IS A MIDNIGHT FRIDAY RANT ABOUT LANGUAGE DESIGN, SO JUST CHILL OUT AND SIT BACK SONNY. CHILL THE F**K OUT. Sorry, just had to clear that up.

Anyway, in one part of this talk he says something like:

rails... doesn't have a special templating language... it just uses ruby evERYWHere...

and I thought "cool -- use it everywhere. Sounds good. Or maybe -- wait! Bad! Bad! Dangerous!!"

Then my wandering memory was struck by a few stray shots of Raganwald talking about using Ruby to build a domain-specific language, and I thought -- "Perhaps... but it'd better be a damn good language."

Y'see, using "one-language-everywhere-for-everything" sounds nice on the surface, but in practice it can be a terrible thing. My friend Zooba recently gave an eloquent example of the 'one language to rule them all' concept and its inherent problems.

But let's dive in as deep as we can. Let's discuss general purpose programming languages. IN DEPTH. Big, bloody awesome terrible depth. Ready?

XML for example has this... general purpose issue.

They (whoever they are...) say -- "XML is so extensible, you can use it everywhere."

But that really means nothing at all. (I'm trying not to... ... y'know... mention lisp...)

XML is so extensible that knowing XML means nothing -- you can know everything about XML but know nothing about how it's used to solve a given problem. Knowing XML doesn't mean you can write a NAnt script. It doesn't mean you can write a stylesheet in XSL. It doesn't mean you can process BPEL4WS.

And say you do use XML for a specific purpose -- XSLT, for example. You quickly find that XML as a programming language sucks.

XML is very versatile because it can contain any element you think of. It can do all sorts of wonderful backflips. But no matter what new things it does, it doesn't stop being XML:

  • It's too verbose to be written by any happy pair of hands.
  • It doesn't lend itself to common programming idioms. For example, you can't really express "if/else" in XSL -- instead you have to write something more akin to a 'switch' (actually called a 'choose' here) whenever you want a straightforward if/else. This is "shovel-in-the-face-ugly" programming at its finest.

An Elephant Is The Only Mammal That Can't Jump

Back to the general point: General purpose languages say nothing when they try to say everything.

"Do I know the C programming language? Well Let me put it this way, buddy! I know the entire English alphabet, all the punctuation characters, plus all ten digits... and C is written entirely in the English alphabet, a few punctuation characters, and a couple of digits.... so yes indeed I do know C, mister. I know a heck of a lot more than that. I've used many of those characters my entire professional typing career."

Consider HTML. Initially it was used for both the structure and display of information. Those are two very small, and closely related domains.

But even this was soon found to be very inefficient. So a second, more specific mini-language -- CSS -- was created to handle the presentation aspect.

This is a trend we see over and over -- little languages are created to handle specific little problems. Any big solution will tend to contain a lot of these "little languages".

Even XSL includes an extra little language -- XPath. (While I dislike XSL, I still have a kind of soft spot for XPath).

A big solution might contain HTML, CSS, XML, SQL, XPath, Regular Expressions and a number of other little languages. And on a managerial level it might seem messy, ugly, disorganised... but no, it's beautiful, efficient, stylish, in a kind of eclectic bohemian way.

And the next point in this "Midnight Friday Rant" is this:

languages are themselves composed of little languages
with other little languages bolted on the side...

...and programs written in those languages:
create extra little languages on top
of the little languages underneath

I'll go through some of those claims a little more slowly.

languages are themselves composed of little languages...

You remember from the hazy hungover university days that language grammars are defined in EBNF -- "Extended Backus-Naur Form".

For example, here's a chunk of EBNF defining the if statement in powershell.

<ifStatementRule> = 'if' '(' <pipelineRule> ')' <statementBlockRule> [
'elseif' '(' <pipelineRule> ')' <statementBlockRule> ]*
[ 'else' <statementBlockRule> ]{0|1}

Notice how this rule is built on top of other rules -- the statementBlockRule, the pipelineRule, and a sprinkling of keywords.

Language grammar is a conglomeration of 'rules'. And those rules are built of smaller rules. That's what I mean when I say 'languages are themselves composed of little languages'. It's turtles all the way down.

And I find this a pretty intriguing thought because it means that you can alter the language itself by altering any one of the little rules on which it's built -- you can even imagine expanding any one of these rules to become an entire programming language in itself.

C# 3.0 for example introduces a new argument modifier -- 'this'. It's only a little part of the written language -- but it gives the compiler a hell of a lot of groovy work to do, and it gives the coder a lot of comfortable features that overcome a lot of the 'execution in the kingdom of nouns' problems that C# suffers from.

When you think about these mini languages one by one -- you see opportunities for changes (some good, many bad) and you see the limitations that the language designers imposed on the system. A lot of the potential for clever innovation is beyond our imagination. There's potential here that we can't kid ourselves into believing we can see. (hell, even the wisest language designers get their best inspiration by stealing off others)

Any one of these rules can be expanded into a language of its own. I'm thinking macro languages. I'm thinking meta-programming.

Take this line of C# 2.0 for example:

public class LinkedList<K,T> where K : IComparable<K>,new()

That final clause means the type K must implement IComparable and have a default (parameterless) constructor. This dense little phrase is a mini-language at its most naked. And I wonder why the language designers stopped here, since they'd already gone so far.
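
Just to make it concrete, here's a minimal sketch of what those two constraints buy you inside the class (the method names are my own invention):

using System;

public class LinkedList<K, T> where K : IComparable<K>, new()
{
    // new() means we can conjure up a default key...
    public K DefaultKey()
    {
        return new K();
    }

    // ...and IComparable<K> means we can order any two keys.
    public bool ComesBefore(K left, K right)
    {
        return left.CompareTo(right) < 0;
    }
}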

A different mini-language that I'm wanting to see in C# is 'pattern matching' -- a fancy term that means allowing various method overloads to distinguish themselves not just by the types that they accept, but by the values too. The sort of overloads we have in C# today distinguish themselves by type: "this overload accepts an object while that overload accepts a string". Pattern matching is a language feature (used in many functional languages) where different overloads distinguish themselves by the incoming values they accept: "this overload accepts an integer less than zero, while that overload accepts an integer greater than or equal to zero". It's a versatile idea that pushes the complexity into a mini-language, where it can be expressed most concisely, rather than leaving it in the mega-general language where it's cumbersome and ugly.
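
For the curious, here's roughly what you're stuck writing in C# today to fake it -- one method that dispatches on the value by hand (a sketch, with made-up method names):

using System;

class PatternMatchingByHand
{
    // What I'd like to write as two 'overloads' distinguished by value
    // ("n < 0" vs "n >= 0") has to collapse into one method with branches.
    public static string Describe(int n)
    {
        if (n < 0) return DescribeNegative(n);
        return DescribeNonNegative(n);
    }

    private static string DescribeNegative(int n)
    {
        return n + " is negative";
    }

    private static string DescribeNonNegative(int n)
    {
        return n + " is zero or positive";
    }
}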

On to the next point:

General Purpose Languages... have... other little languages bolted on the side...

This isn't a re-hashing of the previous point, but an entirely different point altogether.

A general purpose language includes, as part of its core, libraries that implement other, syntactically discrete little languages.

Regular-Fricking-Expressions, for one. A beast of a mini language. A DSL on steroids. If you think your language is better at expressing regular expressions than regular expressions, then hats off to you and I hate your language to bits and pieces. Regular expressions are a world unto themselves, perfectly matching a specific and nasty little knot of problems, and perfectly creating a far nastier little knot of problems, of course.

But look at other mini languages expressed in .net (for example). The string formatting language! The asp.net binding language! Pretty much everything about data access hints at a type-less interpreted mini-language bolted onto the side. (To back up that ambit gambit, here's a quote from Erik Meijer's paper, [PDF!] static typing where possible, dynamic typing when necessary:

"the fact that the datareader API is �statically� typed is a red herring since the API is a statically typed interpretative layer over an basically untyped API (sic)"

(found thanks to local legend Joel Pobar)

[special note to editor-geeks: there's such bliss in quoting a language master, yet getting to use a 'sic' disclaimer...]

So no matter how sweet a 'general purpose language' like C# may be, it still includes interpretative layers over other, domain-specific APIs, as part of the core framework. This indicates that maybe (big maybe) there is not now, nor will there ever be, an efficient and self-contained general purpose language.

Learn a dozen languages, fat head. You need them all.

And we implement... extra little languages on top

So the next big issue is that by writing our own programs in a given language, we create new mini languages of our own. Once we've written our Database Access Layers, then a blub programmer can come along and compose their own 'business solution' entirely using our domain-specific terminology of Customer and Order and Address and so on.

In C# you get to define new types -- that's the basic method of extending the language. Those types can be inherited, can have specific methods and so on. Operators can be overloaded. A couple of other mechanisms. Plus extension methods in 3.0. That's about it.
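
Here's a tiny sketch of the operator-overloading flavour of that, with a made-up Money type, so that calling code gets to write money1 + money2:

using System;

// A made-up Money type: overloading '+' is one of the few ways C# lets
// you bend the written language itself.
public struct Money
{
    public readonly decimal Amount;

    public Money(decimal amount)
    {
        Amount = amount;
    }

    public static Money operator +(Money left, Money right)
    {
        return new Money(left.Amount + right.Amount);
    }

    public override string ToString()
    {
        return Amount.ToString("C");
    }
}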

Other languages boast that much language creation can be achieved, essentially (as far as I can tell -- novice, me) because noun, verb, symbol, keyword, bracket and so on can be swapped around -- and thus claim that 'new programming constructs' can be built on top of the underlying general purpose language.

This is where the versatility of our general purpose language really comes into play. This is the area where the lisp advocates begin to salivate and kiss each other in rabid orgies of delight. Sickening stuff. You notice they were pretty quiet during the earlier segments. This is where the versatility of XML comes to the fore.

For example, long ago, people worked out how to turn Lisp into an object oriented programming language by implementing something that came to be known as the Common Lisp Object System on top of common lisp.

There's a neat concept (The Principle of Least Power) raised by our old buddy Tim Berners-Lee (inventor of the intarwebs) who explained that the dumber a language, the more re-usable the data stored in the code itself becomes. A real ability to build languages on top of your languages seems, for now, to be pinned to the underlying language's parse-ability.

Some languages proudly claim to have the same kind of explicit extensibility as lisp, when all they really allow is 'inventive use of eval and some clever string munging' (hat-tip don) but I digress. While digressed on the eval topic... I did say last week that:

...i think i can use eval() to make it far more powerful yet.

... well I went ahead and used eval to make the world's simplest code generator (javascript edition) far more powerful... no time to explain it now... view source if you need help ;-). Digression over.

While discussing language over-lays, something I regret is that while languages boast an ability to pile new languages on top, there's never an ability to clamp down on the complexity of the underlying language... I'd love, for example, to be able to switch off aspects of certain languages. In Visual Basic, turn off the non-short-circuiting 'And' and 'Or'. Or, in C# for example (as happens in Spec#), the ability to make reference types non-nullable -- or, as in F#, an inability to move through an if statement without specifying a corresponding else statement. [possibly bogus example -- no comments required]

Balancing all these things, I think the way forward is Javascript.

Kidding. Just tired and keen to send this out. It's almost 1 am, and I've got Matt's pre-wedding chug-a-thon tomorrow. Wish me luck and remind me to keep it together.


disclaimer: while I love languages -- I'm still just a language fanboi. I never did the language subject at uni. I never read the dragon book.

 

A weird case study in technical research

I used SQL profiler to grab the SQL that a crystal report generated.

Here's a snippet:

AND "Customer"."OrderDate"<{ts '2007-08-17 00:00:01'})

The curly brackets and the 'ts' looked strange to me. Not something I'd used before.

I wasn't sure what sort of thing 'ts' was. I googled it -- and found nothing. I checked SQL Server Books Online, but no luck there either. Squiggly brackets and 'ts' are, by their nature, very hard to search for.

So I tried some variations to see what happened:

Select {ts '2007-08-17 00:00:01'} -- returns a date
Select {'2007-08-17 00:00:01'} -- fails (Syntax error or access violation)
Select ts '2007-08-17 00:00:01' --fails (Invalid column name 'ts')
Select ts('2007-08-17 00:00:01') --fails ('ts' is not a recognized function name)
select {ts '1'} -- fails (Syntax error converting datetime from character string)
Select {as '2007-08-17 00:00:01'} -- fails (Syntax error or access violation)

I figured it's some kind of special built in thing. And if this exists, there must be other special built in things I don't know about.

I wrote a short powershell program that generates all the combinations of two letters "aa, ab, ac..." right up to "zz". And then (using WSCG) I generated a monster T-SQL script of this form:

select 'aa'
go
Select {aa 5}
go
select 'ab'
go
Select {ab 5}
go
--(and so on up to 'zz')
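
(I used powershell and the WSCG for this, but if you'd rather see the same brute-force idea spelled out in C#, a rough equivalent -- not the exact script I ran -- looks like this:)

using System;
using System.IO;
using System.Text;

class EscapePrefixHunter
{
    // Generate "select 'aa'" / "Select {aa 5}" pairs through 'zz' as one big
    // T-SQL script, so the odd-one-out error messages stand out when it's run.
    static void Main()
    {
        StringBuilder script = new StringBuilder();

        for (char first = 'a'; first <= 'z'; first++)
        {
            for (char second = 'a'; second <= 'z'; second++)
            {
                string pair = first.ToString() + second.ToString();
                script.AppendLine("select '" + pair + "'");
                script.AppendLine("go");
                script.AppendLine("Select {" + pair + " 5}");
                script.AppendLine("go");
            }
        }

        File.WriteAllText("hunt.sql", script.ToString());   // made-up output file name
    }
}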

Now I ran that script to see if any of the two-letter combinations stood out by returning different error messages from the rest.

It turned out there were three such pairs of letters that produced a different error result to the rest:

  1. fn
  2. oj
  3. ts

Now googling those three all together was enough to get me a result, and thus find the meaning and documentation on 'ts'!

'ts' -- in case you didn't guess -- is short for 'time stamp'. These squiggly bracket sequences come to us from the world of ODBC, and crystal presumably uses them because it is trying to be platform (and region) agnostic.

The following is from "Writing International Transact-SQL Statements":

ADO, OLE DB, and ODBC applications should use the ODBC timestamp, date, and time escape clauses of:
{ ts 'yyyy-mm-dd hh:mm:ss[.fff] '} such as: { ts '1998-09-24 10:02:20' }
{ d 'yyyy-mm-dd'} such as: { d '1998-09-24' }
{ t 'hh:mm:ss'} such as: { t '10:02:20'}

And this from google groups...:

ODBC provides a ODBC syntax for dates ( { d 'yyyy-mm-dd'} ), timestamps { ts '...'}), functions ( { fn RIGHT() } ... ) and even outer joins ( { oj .... }). So, if you write using ODBC syntax :

    SELECT * FROM MyTable WHERE MyField = { d '2000-12-31' } 

each ODBC driver translates it into native syntax of its underlying engine before sending the statement. You can see the documentation of ODBC syntax in the Appendixes of ODBC Programmer's Reference (MDAC documentation).

 

The Next Mike Gunderloy

You know I love Mike Gunderloy as much as any heterosexual, god-fearing, code-loving, dude can love another dude. Which is a lot.

But you also know that Mike Gunderloy has, for ethical reasons, decided to shun microsoft as much as any heterosexual, god-fearing, code-loving, dude can. Which is a lot.

And I respect that. A lot. Word, Mike. Word. But I have the bills. And the cynicism of cynicism, and the 'satire-as-a-defence' and so forth. So I stick with microsoft for the foreseeable.

Furthermore, and far sadder: Mike tells us that by the end of this year he plans to give up writing his popular daily microsoft news blog, 'the daily grind'.

So the search is on -- where can I get my succinct daily fix of news relevant to my dayjob, without having to trawl through all the muck myself?

Who's going to get up early so that I don't have to?

A few contenders have stepped into the arena.... what do you think of them?

I highly recommend all of the above.

But are any of them worthy to be granted:

The Inaugural TimeSnapper Professional 'MikeG.Next' Honorary Award Of Linkblogging Excellence?

This is a prestigious prize [I, ah, just invented it, so I should know], the winner of which will win five (5) TimeSnapper licenses to distribute on their blog, plus one (1) for themselves.

If any other Micro-ISVs want to throw some licenses into the prize pool, please step up! I will thank and link and smile ;-)

Andrew Garrett says this (and a whole lot more) about link blogs:

"A blog that�s nothing but link posts don't serve any real value..."

I disagree altogether. If a link blog is a regular and reliable filter of the endless crap out there, then I think it adds tremendous value.

It's a limited market with room for only a few players, yes, but it's of more value than all the social-aggregation populist web 2.0 link sites put together.

In one month's time (September 17) -- we'll look at the nominated linkblogs, and award a prize. Together.

And then, as our wise master said in daily grind 1001:

After Satori, chop wood, carry water.

 

Look at all the things I'm NOT doing!

I read a nice lament about Time Management at Matt Casto's blog and wanted to give my own 'Time Management' tip, specific to .net geeks like you and me:

Make a list of what you DON'T have to do.

Have you heard the saying:

"you don't know what you do until you know what you don't do!"?

For us .net geeks it's important to regularly say:

"okay, here's are some cool and fascinating technologies that i would learn if i had the time, but i'm NOT going to learn them, because I am but a single humble limited human being, not all things to all people"

My Current list of "WON'T DO" learning items is:

And then a second list:

"here's some great stuff that I'm NOT going to learn in depth just yet, because they're still too new. For now, I only need to know the basics"

  • Silverlight
  • Iron Ruby
  • SQL 2008
  • VS 2008

And then I tell myself:

"When they are more mature they will be easier to learn, and more fun to use"

Remember how frustrating XAML was when the tool support was absent? Remember writing WSDL files in notepad? Did you ever try to use Notepad to write windows forms? It's not a good way to spend your years! There is a benefit, but it's outweighed by the cost. When a technology goes from 'bleeding edge' to usable... well there's a lot less bleeding involved!

If I am suddenly thrust into a project where I'm expected to KNOW these technologies inside-out YESTERDAY!... I'll be okay. We are all masters of "Just In Time" learning. It's cool. We can handle it.

If you still feel overwhelmed by the pressure to keep up... go back and read 'You Are Not Inadequate'.

(End of feel-good-pep-talk-for-my-own-benefit-following-teched-overload)

(p.s. Similarly there is a list of web pages that I stop myself from visiting. XKCD. Icanhascheezburger. YouTube. WorseThanFailure. If someone has a funny XKCD comic to show me -- I ask them to email it to me, rather than let myself get drawn into that vortex...)

(p.p.s. The title 'Look at all the things I'm NOT doing' is taken from something DHH said in a popular webcast where he's building a blog engine from scratch using RoR. In my head, I can hear his voice saying it now.)

 

World's Simplest Code Generator: implemented in pure Javascript

worlds simplest code generator implemented in pure javascript

Last night I was stuck using an old laptop, with no dev tools and no internet connection.

A machine without dev tools is like a pub without beer, I lamented, when I suddenly remembered Douglas Crockford's essay 'JavaScript: The World's Most Misunderstood Programming Language.'

So I fired up notepad and set about re-writing "the world's simplest code generator" -- in pure javascript this time.

This was a fun and liberating exercise: programming without any tool support (remember Chucky Petzold's Does Visual Studio Rot The Mind?).

There were a few unexpected delights and challenges that I'll talk about now. If you want to use (or bookmark) this new, faster version of the WSCG, I've polished it up a little and uploaded it here:

world's simplest code generator (javascript edition)

The core of the program is hardly ten lines long -- two simple functions:

function calculate() {
  //split the data into an array of rows
  //($ here is a little shortcut for document.getElementById, defined elsewhere on the page)
  var dataRows = $('txtData').value.split($('rowDelim').value);
  var result = '';
  for (var i = 0; i < dataRows.length; i++) {
    //apply the pattern to each row
    result += apply($('txtPattern').value, dataRows[i], $('fieldDelim').value) + '\r\n';
  }
  //hand the result back; displaying it is done elsewhere
  return result;
}
function apply(pattern, row, fieldDelim) {
  //split the row into an array of fields
  var fields = row.split(fieldDelim);
  for (var col = fields.length; col > 0; col--) {
    //replace '$1' with fields[0]... and so on for each field.
    //(applied in descending order, so that, for example, '$10' isn't mis-interpreted as '$1')
    pattern = pattern.replace(new RegExp('\\$' + col, "g"), trim(fields[col-1]));
  }
  return pattern;
}

The actual code is spaced out a little more (lines aren't nearly so long).

Because it requires no postbacks, this version of the WSCG is the fastest yet. It doesn't have all the features of the previous version, but I think I can use eval() to make it far more powerful yet.

Now something else I've been thinking, though, is that I'd love to rewrite this in F#. Anyone care to help out? A Silverlight implementation would also be fun. I've written a powershell version (in one horribly long line...) and I should blog that some time.

(Side bar: there was a good hanselminutes about F# recently, where Robert Pickering was interviewed.)

As it turns out, the thing I found hardest about writing this without internet access was testing the seemingly simple regular expressions used in the little 'trim' functions I had to write:

function ltrim(input) 
{
   //if it starts with spaces... remove them!
   return input.replace(/^\s+/gm,'')
}
function rtrim(input) 
{
   //if it ends with spaces... remove them!
   return input.replace(/\s+$/gm,'')
}
function trim(input) {
  //trim spaces from the front and the back of the string.
  return ltrim(rtrim(input));
}

If I'd had internet access, I would've used David Seruyange's excellent regex testing tool.

One last thing: it felt unnatural to write a descending for statement:

  for (var col = fields.length; col > 0; col --) 

Have gotten very used to foreach.

 

Argument Modifiers: 'ref', 'out', 'params' and 'this'

Some C# devs must slink through their whole skanky careers, utterly ignorant about argument modifiers.

There's usually a way to avoid thinking about them: if you're willing to write a lot more code and have your existing stuff crash and get patched many more times.

The best known 'argument modifier' is probably 'ref' -- and its purpose is pretty damn powerful. I assume you know its purpose: but just imagine you've just gone from a state of not knowing what the ref param modifier does, to a state where you know what it does. Think of all the things you couldn't do before that are now open to you. Hold that thought.

I'd say that I use 'out' possibly a little more often than I use 'ref' -- yet I wouldn't rank it quite so high on the 'mind-blowing' stakes. The 'params' argument modifier is certainly mind-blowing, if you've never used it before. It gives the language a whole big chunk of 'affordance' -- allowing you to achieve, very simply, ranges of motion that are otherwise almost unachievable.
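
If you've never seen all three side by side, here's a minimal sketch (the method names are invented for illustration):

using System;

class ModifierDemo
{
    // ref: the caller's variable itself is handed over, so changes stick.
    static void Double(ref int value)
    {
        value = value * 2;
    }

    // out: the method promises to assign the variable before returning.
    static bool TryDivide(int a, int b, out int result)
    {
        result = 0;
        if (b == 0) return false;
        result = a / b;
        return true;
    }

    // params: callers can pass as many arguments as they like, comma-separated.
    static int Sum(params int[] numbers)
    {
        int total = 0;
        foreach (int n in numbers) total += n;
        return total;
    }

    static void Main()
    {
        int x = 21;
        Double(ref x);                       // x is now 42

        int answer;
        if (TryDivide(84, 2, out answer))    // answer is 42
            Console.WriteLine(answer);

        Console.WriteLine(Sum(20, 12, 10));  // 42
    }
}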

But what about the 'this' modifier?

I'd wager that 70% of the C# devs out there haven't yet parsed the 'this' argument modifier.

And when they do -- I hope to see the night sky light up with the colourful spectacle of a hundred thousand minds noisily exploding under the power of extension methods.

If you're part of that sad 70%, go now and learn how to implement an extension method or ten.
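
And to save you a click, here's a minimal sketch of one (the WordCount method is made up for illustration):

using System;

// The 'this' modifier on the first parameter turns an ordinary static method
// into an extension method (C# 3.0): callers get to write someString.WordCount().
public static class StringExtensions
{
    public static int WordCount(this string input)
    {
        if (string.IsNullOrEmpty(input)) return 0;
        return input.Split(new char[] { ' ' }, StringSplitOptions.RemoveEmptyEntries).Length;
    }
}

class Demo
{
    static void Main()
    {
        string sentence = "a hundred thousand minds noisily exploding";
        Console.WriteLine(sentence.WordCount());   // 6
    }
}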