What can we learn from software history?
On August 24th 1995, Microsoft released Windows 95 and it was absolutely mental. The world, quite simply, had never seen anything like it. It was Microsoft's iPhone moment and did for them exactly what the iPhone did for Apple - it made them the biggest company in the world. Microsoft were richer than oil companies, banks and... some countries.
People actually stood outside shops in huge queues at midnight to buy Windows 95 (and you thought that only happened for Apple products). Microsoft even hired members of the cast of Friends to make an hour-long instructional video on how it worked. It's hilariously bad and well worth tracking down.
Microsoft supplied people with packs so they could have Windows 95 launch parties. Seriously, they did.
The Rolling Stones were even reportedly paid 8 million dollars so Microsoft could use one of their worst songs, "Start Me Up", as the music for the TV advert.
You may not have had a PC in your home in 1995, I certainly didn't, but regardless you really did know that Windows 95 was being released. It was, quite literally, everywhere.
You don't really need an iPhone or an Apple Watch, but it's highly likely that before you had one, you wanted one anyway. This is the genius of marketing: selling us things in such a way that even though we don't need them to live our lives, we want them anyway because we truly believe they will change the way we live. This is exactly what happened with Windows 95.
I can vividly remember talking about it to the one and only teacher in my secondary school who understood computing (he also ran the network). We were about to get about 50 new computers - Pentium 166 MMX machines with 32MB of RAM, running Windows 95! I was excited and I didn't even understand why. As an aside, this was at a time when the technician at my school wore a white lab coat, as if computers were some kind of secret, magic scientific experiment. The whole setup is quite amusing looking back.
Computing certainly wasn't new in 1995, but there wasn't really any kind of defined, worldwide accepted standard. Without standards there will always be a struggle to educate people - which platform do you decide to learn? Which one should you use?
Computers were also still eye-wateringly expensive - easily £1500 for a basic model. Most computers of the time were 486 machines running at about 66-100MHz with 4-8MB of RAM. Windows 95 needed 16MB to run relatively smoothly, and this was a big deal because 8MB of EDO RAM at the time cost a significant chunk of cash...
Consequently, PCs weren't really that popular in homes and you certainly didn't need one to get by, do your homework or browse a fledgling internet. Have a look at this graph from the Office for National Statistics:
The graph shows the rise of home computing that I've talked about in a previous post. Growth was steady but fairly flat until 1995 - then look what happens: very suddenly, the uptake of home computers goes through the roof. As an aside, it's interesting to note that even in 2010, 20% of households reported owning no computer equipment, which is staggering when you think about it, but it also goes to show why you shouldn't presume everyone has access to technology.
This upturn in the graph is, quite simply, the "Windows 95 Effect." Suddenly, because of a piece of software, you realised you needed a PC.
Microsoft had realised that computers were hard to use and needed to be simplified and this thing called the internet was coming...
Take a look at this:
This is Windows 3.1 - the predecessor to 95. I seriously doubt you'll ever have used it - French air traffic control still do, but that's another story. If you want, you can actually try it out in your web browser and get a feeling for how awesome owning a 486 was back in the day.
Windows 3 itself was a huge leap forward, but the world didn't take much notice. It wasn't great; the idea was simply to make DOS a bit more bearable. It was 16-bit (limited) and crashtastic. Even so, it was far more user-friendly than a command line interface and allowed more people to just get things done - which is the whole point of having a machine, right?
Why did people get so excited about Windows 95, then?
Well, because it genuinely was a huge, huge leap forward in so many areas. Like the microcomputer, the Nintendo Wii, the iPod and the iPhone, to name a few, it did something truly new, truly revolutionary - it changed the way people looked at hardware and changed their expectations forever.
The Wii sold millions of units to older people because it was just so easy to use and found a niche in exercise and other physical games. No one had ever opened that door before. The iPhone was the same: suddenly people realised what a smartphone should be. More than that, it introduced a whole new way of working that was so simple anyone from a two-year-old to your grandmother could use it. Again, this was all new.
The things that make a piece of software, or indeed hardware, such a big deal are surprisingly simple: do something genuinely new, make it simple enough that anyone can use it, and change what people expect from technology forever.
The only problem is, these things are actually incredibly difficult to achieve and are often reached by accident or curiosity rather than by design. Take Google: Larry Page and Sergey Brin just joked about what would happen if they tried to download the whole internet - then they did. Then they changed the world and became billionaires.
Windows 95 introduced the Start menu and taskbar, long file names, plug and play hardware, 32-bit preemptive multitasking and a desktop with shortcuts - the basic blueprint of the modern PC interface.
Every single one of these was a big deal on its own; put them all together in one go and you begin to understand why it was such a monumental event.
Don't get me wrong, it wasn't perfect. You would have to reinstall it almost weekly and it would crash if you even thought about doing something it didn't like. Sometimes it would just crash because it was bored.
However, fast forward 21 years and what's different?
Those fundamental design ideas haven't gone anywhere. In fact, the Start menu was such a big deal that when it disappeared in Windows 8, Microsoft had to bring it back again.
Of course, there have been improvements, not least of which is stability. When Windows 2000 came along it was a revelation - you could leave your computer on all week and it would still work, which really was something of a novelty. Windows 98 was so buggy it needed fumigating, and Microsoft released a Second Edition very quickly before users dropped it in their thousands. Windows 2000 wasn't even meant for home users, but thousands made the switch when they realised how good it could be. Oh, and they'd upgraded their RAM again, of course!
Windows has reached and passed its maturity stage and is now in decline. No longer is the desktop or laptop the computing device of choice; you are far more likely to use Android or iOS for your daily web browsing or YouTube watching. No longer is each release a big deal. I stopped getting excited by new versions of Windows at XP, especially when Microsoft planned a stunningly brilliant release (Longhorn, if you're interested), then ditched nearly all of it and released Vista, which is the most appalling piece of software ever written. It was at that point (2005-ish) that Microsoft lost control of the computer market.
Why did this happen? Well, it's typical of any product lifecycle, but even more so in computing. Around once every 10 years something amazing comes along, which is then adopted by world + dog. Then begins a constant cycle of upgrades that don't add a great deal other than colour changes, new icons and a feature or two - nothing that really blows your socks off after the third version (think about what's amazing from the iPhone 4 onwards - not a great deal). Then begins a decline in interest before the next big thing.
In our recent past there has been Windows, Apple, Google and Facebook. The biggest problem for all of these is that they're mature: they're out of ground-breaking ideas, and no one, not even these huge companies, has the faintest idea what the next big thing will be - they guessed at smart watches and that hasn't happened yet. The interesting thing is, whatever it is has probably just been thought up by someone and is about to happen...
So what can we learn from software development? From great ideas comes great software (given a chance); then comes improvement, refinement and mass adoption. Then... well, inevitably they either miss the boat completely or ruin it. Look at Microsoft Office: I'm willing to bet every single one of you could use Office 97 and have no issues at all. Microsoft are so desperate now that their idea of a new version of Office is to give you your colours back (why did they think we all wanted to work in dull tones of grey?) and change the menu text from lower case to all upper case. There's progress for you.
Not to mention Jony Ive's appalling iOS 7, which saw him take a set of highlighter pens to everything.
Everything you currently know will be nothing in 10 years' time. What comes next... well, I'm excited to find out. It's time for something new and brilliant.
It shouldn't ever surprise you that technology depreciates almost as quickly as a car. It's not as bad today as it used to be, but any computer, laptop or console you buy will gradually decrease in value until it's worth a few quid or simply gets consigned to landfill.
Why? New models come out, technology improves, standards change and equipment simply becomes obsolete - it can no longer perform the tasks we ask of it and has no place in a modern home, office or school. Hence, 5-6 years after purchase, most technology ends up in the bin or on its way to China or India, where some poor worker will contract cancer from recycling (or just burning) our e-waste.
Only, something surprising is happening and it is making me wish I'd never thrown my old computers away...
I was fortunate enough to grow up in what is now viewed as a golden age of computing. The microcomputer revolution had happened and Bill Gates had just made his famous prediction of "a computer on every desk and in every home." Everyone thought he was clearly a bread-based good short of a blanket-seated dining experience. Why would you want a computer? What would you even use it for?? Now look around you - smart watch, smartphone, TV set-top box, smart TV, games console, laptops... all computers, all seen as throwaway devices.
In 1979 a revolution began, fuelled by three companies - Commodore, Acorn and Sinclair. Unfortunately, if you read many accounts of computing history, you'll read that Apple were the catalyst at this point, and it's simply not true - Apple were on their knees with terrible products, high prices and non-existent sales. Indeed, it was Commodore engineers who helped Steve Wozniak fix his broken Apple II design, and it was the sheer stupidity of Commodore executives in ignoring the offer of exclusivity when VisiCalc came along (arguably the most important software release of all time) that meant Apple survived to live another day. If Commodore had taken on VisiCalc, I'd be confident in suggesting Apple would simply have gone out of business or been taken over.
Anyway, this was the time of the home computer - the microcomputer. For the first time ever, computers were no longer the domain of large businesses, banks, governments and universities. Anyone could go out, spend £300-700 and bring home a computer that didn't fill a room or require a new power line to be run to their house.
Weirdly, when products like the Apple I and Commodore PET were introduced, people literally didn't have a clue what to do with them. They were seen as novelty items for geeks and electronics enthusiasts. There were no killer applications for home use at the time, and so they sold in relatively low numbers to home users. This is the same position the smart watch market finds itself in now: a select few buy them, some buy them for fashion reasons, but no one can say they're an essential item to own, unlike a mobile phone, for example.
Rapidly, though, a few things happened. The price of these devices began to fall, faster and more capable chips were developed, and an ever-growing group of developers began to make useful software and entertaining games. It wouldn't be too far-fetched to suggest that games developers were largely responsible for the rise of computing in the home. It was an absolutely fascinating time: no one had ever made a computer make noise, let alone music, before. Nor were graphics really understood - you could count on one hand the number of chips that had been designed that could output a decent image to a screen (again, Commodore/MOS engineers had led the way for years in this field). Imagine living at a time when computers may as well not have existed, and then suddenly, in the space of about 5 years, devices started to appear that allowed you to automate tasks, helped you in your job and, given an imagination and some programming skill, enabled people to make worlds of their own in software. It was incredible.
A generation had learned to code and interact with machines that were, quite frankly, underpowered, difficult to use and very fickle. You'd have everything saved to cassette tape, or 5 1/4 inch floppy disc if you were rich (or a school or business), and one slight blip in the power supply - or a sneeze at the wrong time - and they'd crash all over the place and you'd lose everything you'd done. Hard drives were the stuff of imagination and dreams.
Then came the next wave - IBM PCs and games consoles. Yes, Atari had made some awful consoles in the 1970s and 1980s, but now came the time of Nintendo and Sega, with a wave of innovation and creativity that was quite simply staggering, and nothing would ever be the same again.
In the late 1980s and early 1990s there was still a lot of scepticism about how useful a personal computer could really be in the home. Don't forget there was no mass adoption of the internet at this point (it was hideously expensive in the UK to connect through your phone line), but people were starting to see how the things they were doing at work could be of use at home. A generation of kids crossed their fingers and hoped the answer to their homework was on the Microsoft Encarta CD-ROM (an encyclopedia) so they could copy and paste it... As a side note, Encarta was a big deal back then, and the idea that an encyclopedia could fit on a single disc was quite simply staggering. I remember thinking it was fairly incredible myself, but I didn't have a PC by then so it was of little use - homework would have to be done out of, wait for it, a book!
What people were sure about, however, was how good computers could be for playing games. The games market had gone mental, and products like the 8-bit Nintendo Entertainment System and the Sega Master System had given people their first taste of Mario, Zelda and Alex Kidd - future classics! These machines were no more powerful than the computers available at the time (in fact rather less so in some cases), but the idea of just turning one on and playing was novel and brilliant. They were, quite simply, awesome.
I mentioned these were 8-bit systems. People knew exactly what that meant back then. When you bought a console, home computer or PC (if you had a spare £5000), it came with manuals reminiscent of phone books. Old phone books. Oh god... you've probably never seen a phone book, have you?!
When these books plonked out of the box on to the floor, people didn't ignore them. They had no choice. This is what you got when you turned on your £5000 PC:
So what do you do? You read the book! As a result, users knew exactly how their computers worked, why certain things happened the way they did, where their files lived, how to increase the performance of their system, the importance of sensible file system organisation, and so on. They learned how to manipulate things efficiently and write batch files that did common tasks for them, and they could truly tailor their computing experience to their needs. That kind of knowledge is sadly rare now, except among those with an education in computing. Not everything gets better with progress!
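Those batch files were nothing more than text files of DOS commands run in sequence. As a purely illustrative sketch (the file name and paths here are hypothetical, not from any real manual), a home user might have automated a backup to floppy like this:

```
@echo off
rem BACKUP.BAT - a hypothetical example of the kind of batch file
rem a home user might have written to automate a common task.
echo Backing up documents to the floppy in drive A:...
copy C:\DOCS\*.* A:\
if errorlevel 1 echo Copy failed - check the disc in drive A:
echo Done.
```

Type `BACKUP` at the prompt and the whole job runs itself - exactly the sort of small, time-saving trick the manuals taught people to do.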
The 1990s, then, saw arguably the greatest era of development in the history of computing so far. Prices dropped through the floor as PC manufacturers fought over who could release the fastest machine for the least money. Intel made sure that your computer was obsolete within 6 months.
This was no joke. In the Pentium II/III era, CPU speeds went from 366MHz to 800MHz in a single year! This was crazy. At the time, an increase of 33MHz was noticeable. To spend £1500 on a PC that was basically landfill in 6 months was just unthinkable - but it happened.
At the same time, a golden era of console gaming was under way. Nintendo released the SNES and Sega the Megadrive, and a whole generation enjoyed the rise of Sonic the Hedgehog, Super Mario, Zelda, Star Fox and so forth. Oh, and a little-known company released a terrible football game called FIFA Soccer...
These were 16-bit machines - double the word size, and far more capable, than their predecessors - and, who'd have thought it, they had ergonomically designed controllers! Nintendo even learned from their mistakes and designed a console you couldn't balance a drink on top of, after about a million NES consoles went bang when an excitable child dumped a glass of milk over the electronics.
Ten years ago you couldn't give this stuff away. Got an old 486? A Pentium 166? Pff, not even worth the scrap metal. A Super Nintendo?! Put it in the bin!
Oh how misguided we were.
I owned some machines that are now so rare it makes your eyes water to see how much people are willing to pay for them. I once owned a Commodore 128D and gave it away; good ones now sell for around £300. A boxed Commodore 64? You're looking at £50-100.
486 DX4 with monitor? Up to £300.
A SNES in boxed condition? They're going for at least £150.
These prices might not seem that high to you, but consider this: 5-10 years ago the same devices would've sold for £10-20, if that. You could easily pick one up for free if you looked in the paper - people wanted rid of them. Companies literally hired skips and threw thousands of 486 and Pentium-class PCs away. Now, mark my words, the prices of these machines are only going to go in one direction, and that isn't down. As more and more people want a piece of their past, fewer and fewer machines are available as they break, get thrown out, or are held on to by owners who realise their value. As with anything, sooner or later everything becomes history, and history is valuable! Even though millions upon millions of these devices were made, because of our disposable attitude to goods and the fact that electronic devices can and do break, numbers very quickly become small enough to create value again.
Now the tide has turned and people actually want these machines. Yes, you can emulate them on modern hardware, but it's the experience of ownership, the reminiscence of a lost youth and the feeling of using the original kit that is driving this market. "Retro" gaming is taking off and is about to get seriously huge. As you grow older, suddenly you can afford all the stuff that was way, way out of your league when you were young. Imagine the system or console you dream of - in 5 years' time it'll be pennies - buy it.
If I could give you one piece of advice when it comes to any console, PC or similar, it would be: KEEP IT! Keep the box, all the packaging, the lot. You stand a chance of it being worth a fortune in the future. Even if you never sell it, the memories alone may be worth keeping it for. Oh, and if you have any old Apple kit, it's becoming more and more valuable - people are buying it for the design alone.
Computers, it seems, are the antiques of the not too distant future.
"This is boring. I don't care about computers!"
"Why do we need to learn about computers, anyway?"
Good question. Why learn English? Indeed, why learn anything...
If you'd asked me at the age of 15 why we had to learn English, especially Literature, I'd have said something along the lines of "to keep the company that makes ink cartridges in business." However, with a little more thought it's blindingly obvious why studying English - and more to the point, thoroughly embracing it - is such a big deal.
Most people can converse without too many issues and may never feel the need to write or express themselves in any way other than the ordinary. Clearly, though, those who learn to master the language, to take control of it, inevitably benefit from their intricate understanding and can find themselves in positions where it makes a big difference.
Imagine you were sifting through job applications: would you be impressed by the applicant who seemingly has no grasp of basic grammar, makes spelling mistakes and clearly struggles to structure an argument, or are you going to be caught by the person who writes a strong, flowing case for their right to your new position?
The answer is obvious, isn't it?
Computing, like English, is something we all have in common and all make use of on a daily basis. Just like English, pretty much everyone can use a computer or device (my son mastered iOS devices at the age of 2), but few realise that the true potential and power lies in being able to control computers - and to do that, just as with language, you need to understand its structure, how things work together, what makes it tick, the rules about what you really can do, and what's in the realms of fantasy.
Interest no longer comes into it. Whether we like it or not, whether it's right or wrong, people form judgements when they observe poor use of English or a lack of basic grammar. It defines you as a person - how you communicate sets the tone and gives people a real sense of who you are. The same is now true of technology: it is quite simply impossible to exist in modern society without interacting with some form of machine. The bottom line is that an education in technology prepares you to be literate, to be conversant in a language that is so often misunderstood by so many. Take it a step further, and it may be the difference between working with machines and being replaced by one.
Do you like the idea of being lied to or ripped off? How about being totally helpless when something goes wrong, with something you rely on, daily?
There is a classic analogy here: the person who buys a car, has no idea how it works, and relies on the garage to fix it when something goes wrong. What's wrong with that?! Well... how do you know they're being honest with you? Is your idling problem really so bad that you need £1000 of work carried out, or do you simply need a new mass air flow sensor - which costs £40, which you could fit yourself in 30 minutes, and which just happens to be sending the car's computer round the bend?
An honest establishment is going to tell you the truth, obviously. You've all heard the horror stories, though, or watched the TV documentaries of people who go for an oil change and come out with a huge bill for work that was unnecessary and sometimes wasn't even carried out.
A person with a basic knowledge of mechanics is going to smell a rat a mile away and take their car elsewhere. Better still, they'll probably have a bash at fixing it themselves and rely on "professional" help only when it's a big or safety-critical job.
This needs to be you! Regardless of whether you like computing or not, it is now necessary to have a basic understanding of what computers are and what they can do, just as you should understand cars when you learn to drive, so you don't buy a lemon and end up out of pocket - or worse.
Take a look at this extract from one of the most horrendous high street technology retailers:
What does all that actually mean? Why should you buy the more expensive model? Or should you? What's the difference between a Celeron and an i3?
Their TV adverts are horrific.
"This new laptop with 8GB and a dual core processor for only £399!"
What are they talking about? It's classic - just like car sales, bigger numbers must surely be better! So if you just quote numbers at people, it's really easy to drive sales.
"This one has more gigs, so it must be better!"
So are you better off with a 1TB hard drive, or a 512GB SSD? 1TB is bigger, so that's better, right?!
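Not necessarily - capacity and speed are different things. As a rough, hedged illustration (the throughput figures below are typical ballpark numbers assumed for the example, not benchmarks of any particular drive), here is the arithmetic behind why the "smaller" SSD often wins:

```python
# Capacity is not speed. Compare how long each drive takes to read
# the same amount of data at an assumed sustained transfer rate.

HDD_MBPS = 120   # assumed: typical mechanical hard drive, sequential read
SSD_MBPS = 550   # assumed: typical SATA SSD, sequential read

def seconds_to_read(gigabytes: float, mbps: float) -> float:
    """Time in seconds to read `gigabytes` of data at `mbps` MB/s."""
    return gigabytes * 1024 / mbps

# Reading 10GB - say, loading a big game or copying a photo library:
hdd_time = seconds_to_read(10, HDD_MBPS)
ssd_time = seconds_to_read(10, SSD_MBPS)

print(f"HDD: {hdd_time:.0f}s, SSD: {ssd_time:.0f}s")
print(f"The SSD is {hdd_time / ssd_time:.1f}x faster")
```

On these assumed figures the 1TB hard drive takes well over a minute to do what the SSD does in under twenty seconds - the bigger number on the box tells you nothing about which machine will actually feel faster.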
If you're taking GCSE or A-Level CS, stand around in any PC retailer for a while and listen to the conversations people are having. It's eye-watering how little most customers know, and the sales staff love it. Worse, the sales staff often know only marginally more than the customers do. I've heard some horrific and quite obviously terrible advice given in shops by staff who haven't a clue what they're actually selling.
A little knowledge goes a long, long way. In technology and computing, it could save you thousands in your personal life and, who knows, in your professional life too.
It's better to know what's going on around you and to be in control than to be led - or worse, misled - by others.
That's why you need to know about computing.