If you work in computing or study it for any length of time, you'll find that the more beardy someone is, the more they embrace and promote unnecessary complexity, and the more they frown upon those who would rather live their lives than debate the relative merits of type-safe variables.
In many ways, computing is a stunning example of the outrageous pace at which we have developed technology which has changed, and continues to change, the world. In others, it's a frustrating reminder that a peculiar club of furry-toothed geeks has kept the doors firmly closed to mere mortals for far too long, and it shows exactly what happens when you let a group of "unique" people shut themselves away unattended for a significant period of time whilst they come up with "standards."
What am I on about this time?
When computing kicked off, the idea wasn't to make it complicated. Indeed, the complexity of early machines was solely down to their limitations, and during the microcomputer revolution, great efforts were made to make computers usable by everyone. They were pitched at families and came with great manuals which could get anyone started with programming - this is exactly why BASIC exists and where it came from. It was indeed basic, and it meant that virtually anyone could knock up a simple program in next to no time. Thousands of people did knock up the odd utility or program and didn't need, or want, to consider themselves "a programmer." They just used the facilities of their machine, and programming was just another useful thing it could do.
Then there were the dreams of what might be. One popular prediction was that there would come a time when computers would program themselves. We would simply give them a set of parameters and they would go away and generate the code for us.
We now know that this is a wildly complex task and unlikely (but not impossible) to ever happen. However, it raises this point - the idea was always that computers were designed to be appliances.
I'm sure you'd all agree that when washing machines were invented, they were a huge leap forwards in technology and convenience. Over time, they too have developed. Have they become less accessible? Have they become more convoluted and complex? Absolutely not. It wouldn't be acceptable to expect someone to become an expert in Washing Machine Mechanics and Operational Skills before they could clean their clothes and everyone would just buy another brand or product.
So why do we suffer this in computing? It's a total absurdity.
Why did Apple blow the phone market out of the water and kill Nokia, a global giant, literally overnight? Was their product technically more capable than anything that had ever appeared before? No. Did it contain new technology that had never been seen before? Certainly not. What they had was simplicity. In bucketloads.
But this isn't about hardware, or software. Don't get me started about software - how did it take just one company to show everyone that the idea was to simplify everything, and that if you did, you'd sell so much you could literally Scrooge McDuck into your profits?
No, this is about programming languages.
Without languages, we know there is no software, there is no computing. In the beginning, there was machine code - and we saw that it was mental. Then there was assembly, and we saw that it was efficient, but also mental. Then there were primitive low-level languages, and we saw that they were humane only to mathematicians with an unhealthy love of the Greek alphabet. Finally, someone noticed that it'd be nice to be able to talk to a computer in something that resembles, oh I don't know, the language we speak?! And lo, we saw that it was good.
Then some crazy people said, "wait a minute, maybe eventually we'll be able to tell computers what we want to do pretty much in plain English, and it'll be so easy everyone will be able to make their devices do whatever they want! It's the pinnacle of computing!!"
and the masses doth reply, "why, this is quite something! Surely this is the most sensible idea we've ever heard!"
and all this time, the geeks were mumbling to themselves and grumbling to each other, for they were unhappy that their baby would be taken away from them and given to people who see sunlight on a daily basis and aren't worried they may be soluble in the shower. "Burn them! Witches!" they cried, and the dream of easy programming died.
If you think I'm joking, look into it. Programming got progressively easier, and the 1980s and 1990s produced a generation of computer-literate, code-savvy users who are responsible for the products we all use today.
Then languages just became more and more convoluted. Paradigms were introduced that seemingly exist only to satisfy the sadistic urges of a very select group of technical morons who will bang their "but it's convention and correct practice" drum until someone sensible equips them with roller skates and nudges them down a greased flight of stairs whilst they're wearing Teflon jackets.
Now we have a situation where even governments are in new-trousers mode, because we are facing a world where there aren't any developers any more - "true" computing (read: coding) is once again the realm of those who are blessed with beard, and everyone else can just scratch their heads and dribble a bit. Why do you think the BBC just spewed out thousands of micro:bits to schools across the country in a desperate attempt to get kids "enthused about coding"?
Don't get me started about those either.
Whilst everything else has moved forwards at a truly astonishing rate, programming languages are suffering the equivalent of an obesity epidemic. They've stopped exercising, stopped trying to better themselves, and instead have just got fatter, to the point where they'll soon have a government ad campaign targeted at them to show just how bad for your health they are.
In 2016 it is unthinkable that we still battle with programming languages that will throw a fit over a missing semicolon, a stray bracket, or a single letter typed in the wrong case.
I spoke with a real-world developer today who told me, in a way that clearly expected me to be awestruck, that he'd spent 8 hours trying to find the reason why his latest project didn't work.
Do you know what it was?
He'd typed a word in lower case.
He told it like a veteran recounting a tale of single-handed bravery in the field that would make Arnie look like a blubbing wimp. All I heard was "I program in a language that normal people would question the sanity of."
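For anyone outside the bubble, here's a minimal, hypothetical sketch of that failure mode in Java (not the developer's actual project): identifiers are case-sensitive, so "string" and "String" are entirely different things to the compiler, and one wrong letter stops the build dead.

```java
// Minimal sketch, not the developer's actual code: Java identifiers
// are case-sensitive, so "string" is not the class "String".
public class CaseSensitivity {
    static String greet() {
        String greeting = "it compiles"; // capital S: the real class name
        // string broken = "it does not"; // uncommenting this fails with
        //                                // "cannot find symbol: class string"
        return greeting;
    }

    public static void main(String[] args) {
        System.out.println(greet());
    }
}
```

One character in the wrong case is the difference between a working program and a day lost to an error message.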
Forget computing for a second. Imagine I told you that if you made a single spelling mistake at work one day, it would not only stop you progressing - stop you working for an entire day - but could potentially leave your work totally broken. You'd suggest I'd gone slightly mad and should perhaps go for a lie down in a dark room.
Yet the absurdity is that programmers accept this on a daily basis. Not only that, they embrace it as some kind of weird comfort and treat every triumph over these arbitrary rules, set by sadistic nutcases who sit on standards committees, as proof of personal growth, of their increasing ability to beat the system!
This is all wrong. How did we get to the point where the greatest invention in human history has once more become an incomprehensible mess, accessible only to those who invent it or bask in pointless, needless problem solving?
It doesn't need to be like this. I understand there are situations where you need total control, where you need to know that every clock cycle executes instructions in exactly the way you expected, that we have totally safe and predictable systems. But if you seriously think that day to day developing needs to be as difficult as it is, then you have a screw loose.
By now, we should be moving to a place where programming is a pleasure. Where development environments enable you to design lovely interfaces, describe behaviours, and have the code largely written for you, and when you actually do need to dive in, it's as simple as writing some structured sentences - and I mean sentences, not object.property.attribute.arbitrarypointlesslibraryfunctioncall(lotsofpointlessparameters)
If your language cannot be picked up by a 10-year-old, then I don't want to know. I learned to program at the age of 7, and this was only possible because BASIC was so simple and so forgiving that you could get straight into the good stuff - making your ideas come to life, trying to make your imagination appear on a screen. It was brilliant, because the language didn't stand in your way. Now, I feel sorry for anyone trying to learn any industry-standard language.
Finally, the biggest abomination I have ever witnessed is Greenfoot. Anyone who believes that Java is a language suitable for small children is a stripy blanket short of a picnic. When it is acceptable to use conventions where you name objects with exactly the same name as their class, you have a problem. "Hey, I know, let's write deliberately confusing code that will introduce errors without people even trying! Let's make it harder by differentiating between two very different things with nothing but a capital letter! lololololololololol"
Take a perfectly ordinary line of Greenfoot/Java: Wombat wombat = new Wombat(); - imagine you've never programmed before. That's really intuitive, isn't it? As intuitive as stabbing yourself to death with a goldfish. Sort it out, developers, because this was never the dream.
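To make the complaint concrete, here's a minimal, hypothetical sketch in plain Java (no actual Greenfoot classes assumed) of the convention being criticised, where the class and the object differ only in the capitalisation of one letter:

```java
// Hypothetical sketch of the convention under fire: the class name and
// the variable name differ only in the case of their first letter.
class Wombat {
    private int leavesEaten = 0;

    void eatLeaf() {
        leavesEaten++;
    }

    int getLeavesEaten() {
        return leavesEaten;
    }
}

public class ConventionDemo {
    public static void main(String[] args) {
        Wombat wombat = new Wombat(); // "Wombat" is the class, "wombat" the object
        wombat.eatLeaf();
        System.out.println(wombat.getLeavesEaten()); // prints 1
    }
}
```

To a beginner, Wombat and wombat look like the same word; to the compiler, they are two unrelated things, and mixing them up produces exactly the sort of baffling error this convention invites.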