
TDD, Functions and Design

A function assigns a dependent variable (the range) to an independent variable (the domain). Alternatively stated, a function assigns a result to an argument. And by that definition, we eliminate "void" methods (no range) and properties (no domain). It is accepted, though, that these constructs can still alter the state of an object, but the focus of this post revolves around Functions and TDD.

This definition is useful since it also helps to describe bug resolution as a function, or more accurately, as an inverse function. An inverse function determines the inputs based on the outputs. So when we're resolving bugs, we're inverting a particular function. With that, let's look at releasing functionality in software…

We implement a function, with both range and domain. At some point in time, that function starts to behave "unexpectedly", i.e. it's buggy. Functionally, either the range or the domain of the function has contradicted expectations. And the more granular the function, the greater the likelihood that the domain introduced the bug and that the range is merely symptomatic. Of course, this can change as the implemented function increases in cyclomatic complexity. But back to TDD…

TDD recognized the domain-range behaviour of a function and attempted to introduce a mechanism whereby the domain-range relationship could be documented in an automated fashion. In doing so, a programmer could efficiently determine where the system changed expected behaviour as the domain and range of the various functions increased through integration. The programmer was then empowered to make empirically based decisions, quickly, and in doing so progress (hopefully more rapidly) towards completing the various functions. Or at least, that's the way i always understood it…
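To make that concrete, here's a minimal sketch [NUnit-flavoured C#; the Calculator class and its Add function are hypothetical] of a test documenting the domain-range relationship: for the arguments 2 and 3 (the domain), the expected result (the range) is 5.

    using NUnit.Framework;

    public class Calculator
    {
        public int Add(int a, int b) { return a + b; }
    }

    [TestFixture]
    public class CalculatorTests
    {
        // The test pins down the domain-range relationship of Add:
        // feed it the arguments (domain), assert on the result (range).
        [Test]
        public void Add_TwoAndThree_ReturnsFive()
        {
            Calculator calc = new Calculator();
            Assert.AreEqual(5, calc.Add(2, 3));
        }
    }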

Then a school of thought emerges asserting that TDD is more about design, and not testing. And while it is often proclaimed (and true, imho) that TDD will bring about a better design, it is still more about testing than it is about design. TDD documents what the expected inputs and outputs of any given function are, and the relationship between the range and domain. When it starts to focus on design, or is utilised with design as the primary focus, it's no longer TDD, but should be rephrased: Perspective Driven Design, or User (as in caller) Driven Design. In which case, xUnit frameworks, tools and processes are inadequate. Why indeed would a statement like Assert.AreEqual() be required to check whether the _design_ works? It isn't, and as such, TDD fundamentally remains, and should always remain, functionally oriented.

Along with TDD come some well-understood concepts, defensive programming (DP) being one example. DP has nothing to do with design, but everything to do with functionality. Of course, DP can come without TDD too 😉 But it still remains that TDD is about testing the function, not the form. And whether the form evolves into a good design or not depends on the programmer. TDD won't guarantee you a good design. You can still hack it. And it won't guarantee you good function either: you can easily write pretty bad tests.
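By way of illustration [a contrived sketch], DP simply polices the domain of a function at runtime, rejecting arguments the function was never defined for- it says nothing about the shape of the design:

    using System;

    static class Guarded
    {
        // Defensive programming: guard the domain before computing the range.
        public static decimal Divide(decimal numerator, decimal denominator)
        {
            if (denominator == 0m)
                throw new ArgumentException("denominator may not be zero");
            return numerator / denominator;
        }
    }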

Stick to proper design principles outside and beyond TDD and intelligently utilise a combination of experience, technology and common sense to get it looking right. And use TDD, primarily, to get it acting right. But try not to confuse the two…


Like A Simile

We use similes and metaphors quite frequently, particularly in software. Everything is so figurative, and you spend all day coming up with names for things that are "like" real-world objects. Of course, that's not counting those things which are not like anything else other than what they are in software. This is not about them. And this figurative language we carry over into customer engagements too, explaining to our dear customer why the original quote no longer holds since they subsequently added the kitchen sink and "tinted windows" 🙂 Not to mention explaining why that takes an extra 'x' weeks.
During one such session, it suddenly dawned on me, halfway through the metaphor, that the client had started extending the metaphor, then proceeded to re-apply it to the software project, taking it too literally, and ended up explaining to me how software works based on the metaphor at hand. Doh! And all i was trying to do was help educate and build understanding…

I think IT, in general, has come under heavy pressure to talk the language of the client- and indeed, we are encouraged to keep getting better at this. Make it more accessible so that they understand… Why?

I don't understand that thing my dentist uses to fill a tooth; nor do i care about the gadget the plumber used to seal the pipes; and yay for the structural engineer who… well, i really don't know what he did. But i still use and pay for their services. Did they go to great lengths to explain the intricacies of their tasks? Nope. Did it matter? Not really. I engaged them- they told me how much and how long- i paid, they did it. Moving right along.

I think the more we "passively force" non-programmers to understand what we're doing, the more confusion we sow. Particularly because, when the metaphor we use is so accessible, we have no idea how their experiences, assumptions and understandings will impact the point we're trying to convey.

As programmers, we owe it to ourselves to forge the language (jargon) we have created, not with the purpose of creating a divide, but so that we can just get on with the job, efficiently. Without the extra fluff. Of course, that's not to say don't make an effort to help a client understand, if they want to understand on technical terms. Dumbing it down doesn't really help, except maybe in polite social discussions 😉


Software is Hard

You may have heard it before, and if you haven't, you need to understand it now. Besides all the technology and implementation caveats, competing frameworks and the myriad variety of business rules within the same vertical, there's also the challenge of finding competent resources.

Coding is the easy part. Almost anyone can do it, given enough time with a "Learn _INSERT_LANGUAGE_ in 24Hrs" and an off-the-shelf PC. Even better, do a condensed one-year course, come out the other side "qualified" and, better still, command a "qualified" salary. What ludicrosity!

Now that's not to say one-year condensed courses and self-taught programmers are to be frowned upon. Some of the better programmers i have worked with are self-taught. What you're not, as a self-taught programmer or one-year crash-course graduate, is qualified. Far from it. And then, barely 4 years into their careers, some candidates reckon they've seen it all: they're intermediate, looking for a senior post (along with the salary). Inconceivable.

Yet, this is what seems to be the status quo in South Africa right now (or at least, that is, in Cape Town). There is a massive demand for C# programmers (in particular), and the demand has outstripped the supply so greatly that the market is flooded with inexperienced candidates making (and sometimes getting) outrageous salary demands. And the only reason they're getting it, imho, is that whoever's hiring them a) isn't concerned with quality, b) hasn't a clue what they're doing, or c) is doing so well financially they can afford the risk. And maybe they have a secret for mitigating that risk that i don't know about…

Of course, the greater danger is the software that is already out there. And then there's the pervading negative perception of bespoke software, which continues because a bunch of "senior" programmers couldn't get it right… Go figure.

Two prominent cases recently gave the issue some visibility: the traffic registration system and the online tax filing system. The traffic department came to a grinding halt for literally weeks because of concurrency issues in the system. The online tax filing system held up well enough during the pilot, but when it was opened to the public, it hung. Badly. And we're not talking about millions of hits per day either…

Until you've got about 7yrs of pure development experience (as a guideline, not a dogmatic rule), i don't think you're qualified enough to even contemplate leading anything. Stick around for a bit, get some more diverse experience, and when you start approaching 10 years in the trenches, maybe then consider grooming yourself in preparation for more of a leadership role. Until then, you're ultimately only bluffing yourself. Of course, it doesn't help that the market regards a one-month MCSD as a competent programmer. It just makes it easier to sidestep your own professional integrity.

Please note, the actual number of years is a guideline and, as with everything, there is a lot more to consider than just historical “age”.


How The Other Half Compute

i've been a Windows junkie for a long time now. Made my living living, eating and breathing Windows. From '98 to XP, and now Vista, i've been using, coding, configuring, patching, coding [did i say that twice ;)] everything Microsoft. Funny though, my roots in computing go back to C, Borland nogal. Anyways, Microsoft-related work paid quite well, so it's what i did- and continue to do. However…

Because i work within the education arena of software, it was only inevitable that i'd come across a project like edubuntu and, from that, get infected by a whole other world. And slowly, but surely, things are getting infected, in a good way 😀

But the challenge remains… how do i migrate into a linux world? And "why even bother?" would be a good challenge to pose… indeed, why? Equally inviting: why not? More on that later.

First step: replace XP with Ubuntu as the primary OS for all regular "work". Writing documents, emailing and IM are OS-independent tasks. Developing ASP.Net, not so much… but we'll see [he says with a glimmer in his eye].

Fortunately, and i do mean fortunately, most of the implementation tools we use to develop ASP.Net are non-MS. Our IDE of choice, as an example, is SharpDevelop. And interestingly enough, it won hands-down against VS.Net [Express Edition]. It's just soooo much more productive 😀 And in case you're wondering why not the full-fledged VS.Net product: we compared apples with apples, and the apples in this case were $.

Hey! We live in Africa. We certainly cannot afford them shiny-looking price tags.

The adventure begins….

nogal is, as far as i know, a purely south african term, pronounced nau-chul, with the ch closer to the classic Scottish 'ch' as in loch ness than anything else. kinda like you're clearing your throat. anyway, it resembles the english equivalent: nonetheless.


An Assumption Makes…

Assumptions are wonderful. They allow you to fly ahead without needing to fuss over any time-consuming details… until, that is, the assumption fails you. Like the "ref" keyword, for example.

A data type is a value type if it holds the data within its own memory allocation [e.g. numeric types, bool, any struct type]. A reference type contains a pointer to another memory location that holds the data [e.g. String, any class type].

So, when it comes to argument passing in .Net, by default, all value types are passed by, well, value 🙂 and reference types, too, are passed by value. But the value, in the case of a reference type, is the value of the reference. This little nuance [nuisance, at first] popped up quite late in my project, simply because the implemented design didn't call for any dynamic re-allocating, until one particular test started failing during what was supposed to be routine "refactoring". I say "supposed to be" because the refactor ended up changing the behaviour, hence it was no longer a refactor… anyhoooo…
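A minimal sketch of that nuance [the Box class and values here are hypothetical]: reassigning a reference parameter only changes the local copy of the reference, unless it's passed with "ref", while mutating the object it points to is always visible to the caller.

    using System;

    class Box { public int Value; }

    class Program
    {
        // The reference is copied in: reassigning 'b' changes only the copy.
        static void Reassign(Box b) { b = new Box(); b.Value = 99; }

        // With 'ref', the caller's reference itself is passed in,
        // so the reassignment is visible outside.
        static void ReassignRef(ref Box b) { b = new Box(); b.Value = 99; }

        // Mutating the shared object is visible either way.
        static void Mutate(Box b) { b.Value = 42; }

        static void Main()
        {
            Box box = new Box();
            box.Value = 1;
            Reassign(box);
            Console.WriteLine(box.Value); // 1 - caller's reference unchanged
            Mutate(box);
            Console.WriteLine(box.Value); // 42 - shared object was mutated
            ReassignRef(ref box);
            Console.WriteLine(box.Value); // 99 - reference replaced via 'ref'
        }
    }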

My assumption [reference types = raw pointer, a la C++] allowed me to gloss over an entire section of the C# specification. Had i given any real thought to this little abstraction way back then, i would probably still have forgotten about it by the time i had to call upon that knowledge for the first time, months later. Or not. At least now i know i won't *ever* forget about it 🙂

Any which way, i still think assumptions are good. Even better though if you can change them. And best if you can constantly challenge them, or have them challenged. And yes, an assumption can lead you off in the wrong direction, but hey! At least it's a direction. i could have indulged the specification in depth beforehand and told my employer to wait n months while i read all about C#… don't think that would have worked too well either.

As always, some tempering between the extremes is refreshingly welcome…


Methodology Wars

It is interesting to see the increasingly varied responses to Agile as it waxes and wanes in popularity. From within the ranks of the new order rises a breed of zealots determined to see their ways overthrow the old. The stoics glare down at this revolution with some contempt, and with evidence of its untrustworthiness. The evidence itself is backed up by their reputation, and that should be enough for victory. Yet the new order marches on, and in between, there's another minority quietly getting on with fusing the two…

I guess my view of government organisations is influenced by the incompetence of many. As grossly unjust as my opinion is, i am encouraged by many others, and most recently by: Army Simulation Program Balances Agile and Traditional Methods With Success. All this while i've been quietly researching combining the two [motto: to be Agile enough to do Waterfall] and wondering just how this fusion would play out in a larger project.

Rather sweetly actually: OneSAF is a success!

This does pose a bit of a problem for the stoics and zealots though. Who gets to claim this victory? Or will they both ignore this one 🙂 Or maybe we can expect a new range of books, both for and against OneSAF?

Refactoring Agile: How OneSAF Could Have Been Better.
Traditional DeadWeight: Agile Carries OneSAF.

All i can say is: “Kudos to Mr Parsons, LTC Surdu and the team on some clear thinking!”


The Right Tool

It's a theme which comes up quite regularly: "the right tool for the right job". From DIY to software, this mantra, put into practice, literally saves you money. Let alone the bundle of emotional energy which gets exhausted trying to fit round blocks into square holes. I can use a knife as a screwdriver, my cellphone as a delicate hammer and chewing gum as glue. As much as they work, they are not the right tools for the right job.

In software, we tend to think of tools as, inter alia: compilers, languages, debuggers, IDEs, SDKs and drivers. Very rarely do we regard our processes, resources and skills as tools. We define these instead as attributes of a project, team or individual. But attributes are abstract things, such as value, determination and enjoyment. Of course, we don't want to offend any person by referring to them as a tool. But let's rise above ourselves for a moment here, and ignore the connotations of "tool". In doing so, we can take advantage of the definition to pursue success with greater enjoyment.

The challenge: need to respond to aggressive shifts in the market and stay ahead of competition by releasing features rapidly. The tool: Agile.

The challenge: need to carefully plan out the next n months of development in a mature vertical market. The tool: Waterfall.

The challenge: need to research algorithms and implementations of those algorithms on different platforms. The tool: established computer scientists.

The challenge: need to mentor a team of young programmers within a business application product. The tool: pair-programming.

Instead of dogmatically insisting that the "attributes" of your project (Processes, Resources and SkillSet) determine its course, let the real attributes (success, enjoyment and value) be a product of your project- but manage those just as pro-actively as any other aspect of the project. And while you course through your project, employ the right tools for the right job at the right time.

And just because a hammer worked really well driving that nail into the wall, it doesn’t mean it’ll do an equally fantastic job at attaching the mirror to the wall.

The irony for me, though, is that we all instinctively know this and, by some subconscious decision-making process, we apply this principle quite well- up to a point. It's when we don't apply it that we end up in a "Houston, we have a problem" moment.

I think the art of getting this right is, like anything else: be conscious of what you do and decidedly know what makes you successful. Like your code: when it unexpectedly stops working, it's probably because you were never really sure why it was working to begin with…


DataBinding on ASP.Net

After consideration from a previous post, i decided to boldly test my new class designs by making those private instance variables public. Beyond struggling against my own niggles, i discovered that ASP.Net doesn’t like breaking PrivateInstanceVariableMandate either. Consider the DataBinder for example.

In your mark-up code, a "typical" repeater section might look something like this [a representative sketch- the EmailRepeater and the Subject property are hypothetical]:
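    <asp:Repeater ID="EmailRepeater" runat="server">
      <ItemTemplate>
        <%-- DataBinder.Eval resolves "Subject" as a property of the data item --%>
        <%# DataBinder.Eval(Container.DataItem, "Subject") %>
      </ItemTemplate>
    </asp:Repeater>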


DataBinder.Eval, however, only evaluates properties of that object. Public fields [class variables] are not properties, and so breaking with tradition to make things simpler proved to be more complicated down the line [on this framework].

With a little coercion, it's easy enuff to write a FieldBinder [a minimal reflection-based sketch]
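    using System;
    using System.Reflection;

    // A minimal FieldBinder: the counterpart to DataBinder.Eval, but it
    // resolves public instance fields via reflection instead of properties.
    public static class FieldBinder
    {
        public static object Eval(object container, string fieldName)
        {
            if (container == null)
                throw new ArgumentNullException("container");
            FieldInfo field = container.GetType().GetField(
                fieldName, BindingFlags.Public | BindingFlags.Instance);
            if (field == null)
                throw new ArgumentException("No public field named " + fieldName);
            return field.GetValue(container);
        }
    }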


to allow mark-up such as the above, with FieldBinder.Eval in place of DataBinder.Eval. All of a sudden i can access fields and it's all good. Now the pros and cons of introducing FieldBinder into the system can be debated at length, but the beauty of this insight led me to an even more interesting possibility: a FunctionBinder (sketched below).


There are times when real-time data associated with an object needs to be retrieved. Making the function a property, in the no-argument case, is again a design preference, but it can lead to problems, since the property EmailCount conveys a different sentiment to the method GetEmailCount(). The method, for me in my current project, implies that more work is done to retrieve the value than simply looking at one of the already-present values. So i really want to keep it a function, but i can't bind to it using DataBinder.
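A FunctionBinder follows the same pattern [again a sketch- it assumes a public no-argument method on the data item]:

    using System;
    using System.Reflection;

    // A FunctionBinder in the same spirit as FieldBinder: binds to the
    // return value of a public no-argument method on the data item.
    public static class FunctionBinder
    {
        public static object Eval(object container, string methodName)
        {
            if (container == null)
                throw new ArgumentNullException("container");
            MethodInfo method = container.GetType().GetMethod(
                methodName, Type.EmptyTypes);
            if (method == null)
                throw new ArgumentException("No public no-argument method named " + methodName);
            return method.Invoke(container, null);
        }
    }

In the mark-up, that becomes, say, <%# FunctionBinder.Eval(Container.DataItem, "GetEmailCount") %>.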

Extending this idea further, you can overload FunctionBinder to accept specific parameter types or arrays of objects for the function arguments. How you do that, well, that’s your decision…

SourceCode


I Love My TestHarness

now wouldn’t that make a great bumper sticker? 😀

again, yesterday, i experienced the fullness of my beloved test harness. as it always happens, business requirements change; dynamic market pressures or product discovery over time dictate that change is required. now whether you've spent 6 months designing before coding or 6 months designing through coding [implemented code IS the design], how do you evaluate the impact of the change accurately? how do you go back to the hand that feeds you, estimate the cost of change with confidence [which impacts on marketing strategy and promises] and maintain that near-perfect delivery track record? and then, how do you know, for sure, that your implemented change doesn't inadvertently break some other part of the system- because the system is now so huge [increasing feature set over time] it's getting near impossible to hold it all in your head at any one moment?

welcome the testharness!

it was so quick to implement the change [fully tested on its own, of course], but then came integrating the new module into the existing system… the next step was to figure out where to begin with handling all the other intelligence that relies on the old structures and is impacted by the new structure.

ran the tests. red bar, with breaking intelligence over one primary area. there were one or two ad hoc modules affected. no sweat. the beauty was that i didn't need to comb through the system to find them. i let my system tell me, and in doing so saved myself a load of cognitive energy. now that the buggy areas are recorded, i can fix one test at a time until the bar is green and voila! 😀

there was *life*, allegedly, before a test harness, and then there's life with a test harness. i cannot remember what software development was like before. i just know that now, more than ever before, i truly passionately enjoy my craft! [which also happens to pay the rent ;)]


It’s all in the estimate

Estimates form the basis for all software projects. In fact, estimates are part and parcel of our daily lives. We live, plan and act by them. Our expectations are met or shattered based on the estimates we feed into our lives.

Buying food, you might estimate, before you set out, how much money you need and how long you might be, and you plan supper, a night out or a telephone call based on the estimates you give yourself. When life goes according to our estimates, we're happy. Everything is running smoothly. When estimates are wrong, we adjust; but sometimes the ability a bad estimate has to flap its wings and spiral out of control can be deadly. Especially for a software project.

Based on estimates, business programmes are set in motion. Budgets are approved and marketing plans are established. The length of the estimate is largely irrelevant. What's critical is its accuracy. When projects, and hence people's careers, lives, finances, lifestyles and families, are on the line [ok, maybe a tad dramatic :)], an estimate has the misfortune of not being taken seriously enough on the one hand, or far too seriously on the other.

Taken too seriously and the time to estimate can be as long as the time to do. Not taken seriously and the time to do is incalculably longer than the time to estimate.

And because estimates can be so critical, it does make sense that the right amount of energy be invested into getting them as accurate as they need to be, weighed against the cost of getting them too accurate. No solution strategy, technology or skill set is going to set you up professionally if you don't know how long it's going to take you to do something- to do anything. If you don't really know how long it will take, it implies that you don't really know what you're doing. And if you establish a trend over time of not delivering when you say you will, it shouldn't come as a surprise if your services become undervalued.

Of course, you can always “buffer” your estimates and play it safe. But in a dog-eat-dog world where the ubiquitous “5-minute solution” marketing threatens your chances of being awarded the contract [or the glory- however inaccurate those 5 minutes are], buffering can be expensive.

Considering the risks, together with the cost of not being accurate, it pays dividends to be more boldly accurate. And being more boldly accurate takes time invested in getting to know your weaknesses. It takes being honest with your progress, using feedback mechanisms that might hurt your feelings. It requires that you get better, not just at what you do, but at how you do what you do.