One of the things I'm beginning to notice about working in the software world is that software rarely works like it was designed, and when it finally does work, it's probably not quite how it was initially designed anyway. As I've tried to hone my programming skills, it's this little process that has kept pushing me back:
- Write program1
- Run program1
- Write program2
- Realise program1 was pants
- Rewrite program1
- Realise program2 was pants
Anyhow, without realising it, you end up at a stage where it no longer makes sense to keep improving your program, even if doing so is well within the abilities you've gained by writing your next one.
However, this doesn't just apply to individuals but to groups too - and not just to programs, but to features.
Each time a feature is improved it inadvertently gets more complex; then it gets improved again and becomes more complex still, until it reaches the stage where, because the original feature was so simple, reading the documentation just to do that original thing is more hassle than writing the feature out in code yourself rather than using the library.
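To make that concrete, here's a contrived Python sketch - the function names and every one of those parameters are hypothetical, not taken from any real library. It imagines a feature that started life as 'read key=value pairs from a file' and has accreted an option with every improvement, next to the handful of lines most people would rather just write themselves.

```python
# Hypothetical "mature" library version: every release added another knob,
# so even the simple case means reading the docs to learn which defaults apply.
def load_config(path, *, delimiter="=", comment_chars=("#", ";"),
                strip_quotes=True, interpolate=False, on_duplicate="last",
                encoding="utf-8", type_coercion=None):
    ...  # body omitted; the signature alone is the point


# The original feature, written out by hand, which is all most callers wanted.
def load_config_simple(path):
    settings = {}
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, value = line.split("=", 1)
                settings[key.strip()] = value.strip()
    return settings
```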
One of the reasons I believe the commandline is still going strong after so many years is that it immediately puts a barrier against 'information overload' - sure, there are commands with more flags on them than the UN headquarters, but other commands have been kept simple.
The other thing I'm seeing a pattern in is balancing customer expectations.
Google went through a period of releasing all their new software with a BETA tag. Aye, you may say that was just a marketing ploy - but it worked. It was free, it was marked 'not quite stable', and for most of us it still 'just worked.' Now compare that to Windows 95 when it came out - bells and whistles, the CEO announcing it was the 'future of computing' for us all - only to be followed up by a Blue Screen of Death. So what's changed in the last 15+ years?
Well, we've gone from thinking that because computers are binary, they're either right or wrong. I remember someone saying to me once that it was 'impossible to delete a file' - why? Because there is no chemical reaction, only physical reactions, and physical reactions are reversible.
Though many of us may have a laugh at that now, and run for our nearest copy of DBAN to render our hard drives unreadable (or grab the nearest hammer, as is the practice at one of my clients) - it's the way people thought. Now people are moving on to say, well, Google's not always right, therefore computers aren't always right. That may be flawed logic, but the conclusion is the same: computers don't always get the answer right. They're even building a new chip based on 'probability' rather than AND, OR and XOR. How's that going to change computing?
With all these users now getting lackadaisical about the stability and correctness of their computers, people aren't too fussed if their software doesn't work. The majority of the time they've not paid for it - they've probably downloaded it (illegally, in many cases) or had it supplied by someone else. It doesn't matter if it doesn't work quite right. Which brings me full circle.
If someone does pay for software and wants it delivered, they often require it to be correct first time - but how feasible is that, given the new complexities of computing?
Discuss.