If you're not already making use of Office 365 and your 1TB (that's a lot) of online storage, then you should be.
All you need to know is your school email address and normal school password. Then you can log in at the Office 365 portal here.
Reasons you should be using it:
Watch the video below for more information.
I've recently been developing an application at school which pulls class data out of our management system and spits out an editable seating plan. It's meant to be an easy to use, press one button and don't worry about it job.
After a few evenings of programming I had something that worked fairly well and got to the point where I needed to test it with some different data sets to make sure it didn't throw its toys out of the pram when presented with unexpected information.
I got to a point where I thought it was pretty much finished and off I went down to the technicians' office to share what I'd been doing and get them to test it out for me. So, there I was, clicking away, showing how you could go back, change your mind, change options, load a different data file and so on, and it was all working as it should.
No. I lie. As is so typical in computing, what works perfectly for you every single time will inevitably die in a hideous fashion as soon as you present it to anyone. It's like oranges and bananas in your fruit bowl - they're fine until you walk out of the room, at which point they immediately go soft, brown or gooey.
I fired it up, loaded the data file and showed how it works. At which point I changed one of the options and suddenly all the data disappeared. I then uttered the words that all people who work in IT say when they have absolutely no idea what's going on - "that's interesting" which translated means, "I've broken it and have no idea how, why or what I'm going to do about it."
If you were in my class I'd now suggest that you bung a break point in the program at a sensible point and then step through using the debugger to find out what's going on. So, once I'd got to the point where I could replicate the error every single time, this is exactly what I did, and this is where it gets weird.
You can switch off now if you're not interested in the beardy bits...
The data in my program is stored as a list of objects. When a button is pressed, that list is passed to another form, which rearranges the data into the order the user selects and then fills the form in the layout requested. None of the code is destructive and once the form is closed, the data is re-sent from the main form if another plan is required.
What was happening was at some point the list of objects was being cleared - which should be fairly easy to pin down. So I started following the program and watching the contents of the list at each stage.
The code would run first time without any issues whatsoever, but click the button a second time and this time, when a certain subroutine returned, the list would be empty. The strange thing was that at the end of the sub the list was populated, but on returning it was cleared.
"it's a byref, byval problem!" I can hear you say. "Think about the scope!"
I'd agree with you, but if that were the case the list would be cleared every time. Nothing had changed, the exact same code was being run. First time, fine. Second time, not fine.
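If you want to see the shape of the bug we were chasing, here's a minimal sketch - written in Java purely for illustration, it's not the actual seating plan code - of how clearing a passed-in list wipes out the caller's data too:

import java.util.ArrayList;
import java.util.List;

public class SharedListBug {

    // The callee receives a reference to the very same list the caller holds,
    // so clearing it here empties the caller's data as well.
    static void rearrange(List<String> students) {
        students.clear();
        // ...imagine the re-population step then fails or never runs
    }

    public static void main(String[] args) {
        List<String> classList = new ArrayList<>();
        classList.add("Alice");
        classList.add("Bob");
        classList.add("Cara");

        rearrange(classList);
        System.out.println(classList);   // prints [] - the data has "disappeared"
    }
}

That's the textbook version, anyway - and as you'll see, it wasn't quite what was going on here.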
Scratching around for a solution, I then commented out one line of code. The line came before the point where the list was finally populated at the end of the subroutine, and the debugger had shown it working fine.
It worked. No bug any more.
This was mind bending for so many reasons - the line didn't empty the list on the first run through, didn't empty the list when debugged and wasn't the point at which the list was disappearing in the debugger - that was happening 3 or 4 statements later! And yet... it fixed the problem.
This happens in programming sometimes. You can do something that absolutely shouldn't work or shouldn't have any effect on anything and yet you'll get bizarre results. I'm still lost as to why this fix worked, but considering my code isn't going to risk anyone's life, I think I'll take it and walk away...!
Inevitably, this can happen in class as well - try explaining that one to 25 perplexed students...
How many of you can type at a reasonable speed, say around 50 words per minute? If you're not sure, test yourself by clicking this link. I tend to average around 65wpm with text that I don't know, higher if I'm typing original text. My typing has actually slowed in recent years; when I was at university, and probably doing the most typing I've ever done, I averaged around the 80wpm mark without too much trouble. As with anything, it's down to practice.
But why is it important? The fact is, to be productive you have to be able to type. No one, despite millions being spent on research and development, has come up with a better method of text entry than the QWERTY keyboard. Sometimes it's just the case that an old design is... the right design! Take a look around a Year 7 classroom (and later years, I'm not singling them out) and you'd be shocked at how poor the typing skills of students are - it's not unusual to find students who only use one finger to jab at the keys in an incredibly painstaking process that takes, unsurprisingly, forever.
The technology of each generation does tend to dictate the set of skills they end up either consciously or inadvertently learning along the way. If you go way back, it was usual for students to learn to take notes in shorthand, or to touch type (no looking at the keys, ever). Move forwards to the 1990s and for well over a decade people round the world suddenly became amazingly adept at typing because of the growth of instant messaging over the internet. It wasn't uncommon to have an AOL IM account, Yahoo Messenger and MSN Messenger all running at the same time, with at least 6 conversations on the go all at once, and the taskbar flashing at you like a police car, demanding your attention. Every young person at the time knew how to alt-tab their way around each box and hammer out messages at well over 100wpm without thinking - they learned to adapt because you had to!
Strangely, that seems to have been the golden age of communication online. Since that time all the messenger services have died off for one reason or another and nothing really has replaced them. In that time, we've had the advent of touch screen devices and a real shift away from traditional PC use, meaning that today's students simply don't use computers in the way people of an older generation are used to, or expect.
This is actually causing quite a problem. I asked a class of Year 7 students recently, "how many of you use a desktop PC or laptop at home on a regular basis?" Approximately 3 hands went up - 10% of the class. At first, this surprised me. We asked other classes and the numbers never rose much above 20% of students in each group. This goes a long way to explaining some of the problems we have when students arrive at secondary school, especially the length of time they take to adjust to our way of working, but particularly to how a computer works. The number one problem used to be that students could not understand the idea that one "drive" was theirs and another was shared on a network and they can't save to it. Now it's that coupled with the fact... students don't really know how to use a traditional computer at all!
Why was I surprised, though? My own desktop computer sits in a cupboard at home gathering dust, not because I'm not interested in computing any more, but simply because there isn't a need to use it. Desktop PCs are antisocial machines that require you to sit, usually out of the way, and, these days, don't actually offer any compelling functionality unless you're doing tasks which require a powerful machine. This is compounded by the fact I now have a very portable and powerful laptop in the form of a Surface Book, which means I have even less motivation to use a desktop, but... I still use a PC every day!
So what are younger people doing? They're using tablets and, predominantly, phones. Why wouldn't they? The internet is available in its entirety in their pocket, they can communicate and listen to music, share experiences and have everything they need in one place. They're extremely adept at using these devices and are quick to share new applications, methods of working or just things they think are interesting. In this way, they're the equivalent of our MSN generation, they've learned to use the technology available to them and adapted to it perfectly.
The outcome of this is, strangely, that we have in many ways returned to the early 1990s, where there is about to be a real need for the teaching of "ICT" skills again. Students are brilliantly aware of the technology that is prevalent to them, but woefully ill-equipped to deal with the working world that they will move into. Yes, I acknowledge that touch screen/tablet technology will become more acceptable in business, more mainstream and more integrated with traditional machines, but the bottom line is, to do decent work you are going to need to be adept at using a mouse, keyboard and desktop operating system. No one has come up with a better method of working and I cannot see this changing in the short to medium term.
One thing you can be certain of, however, is that these technologies will merge. Look back and you see the perfect example being modern smartphones. Where once you needed a computer to browse the "full" internet, a music player to store your music and a phone for communications, they are now all one device. Laptops are already changing to have screens which detach to become tablets but no one has found the perfect medium yet and may never do.
Apple have spent millions on research into touch screens and traditional computers, and Steve Jobs, before his death, was adamant there was no way that touch screens worked on traditional computers - the two just don't match. It's interesting, then, that Apple are now desperately trying to sell iPads with keyboards and touting them as "computers." The fact here is that Microsoft may well have accidentally gone down the right path after all with Surface, in that it's a fully featured PC in a convenient form factor. Only it's absurdly expensive...
Anyway, the bottom line is, to be productive we need typists. We need people who can work across a range of technology and, sadly, right at the time when our students need it the most... we've killed off ICT. Nice work, CAS!
Love it or hate it, people young and old are going nuts for a bit of Pokemon Go. On the face of it, it all seems rather pointless - you walk around to find characters and then flick them a bit. But there is much, much more to it than this and it's heralding a new era of games that will completely change how people think of gaming.
Since the release of Pokemon Go, the value of Nintendo has risen by over $7bn as people have been rushing to buy shares in the company. Why? Well, investors in general are not stupid people, they're responsible for vast sums of money and amongst other things, our pensions, so nothing too important. In simple terms, they're gamblers who have unlimited amounts of someone else's cash to waste. Sounds good to me.
So why have investors been betting on Nintendo, or at least showing strong support for a new concept? The reason is because this is the first successful example of mass adoption of Augmented Reality and the potential market is absolutely massive - big emerging markets are like fairy dust, Christmas, rainbow riding unicorns and pots of gold at the end of very large rainbows to investors because they can bet low and make absurd profits when it all goes mental.
Augmented Reality is in its infancy at the moment and will either surpass or at the very least complement the very best virtual reality. The fact is that augmented reality is more accessible than VR, has a much wider range of opportunity than VR and solves all the problems of VR without trying - you don't need to stay in your house/one location, you're not going to break your nose running into a wall or smash your house up because you can't see it, and it won't make you redecorate the place with vomit because of motion sickness.
Then there's the potential, not just creative but social also. People are already getting really excited that technology may actually hold a solution for the one big problem it has helped to create - a distinct lack of movement. Now people are running around town, chasing imaginary characters (a wonderful image of a crazy society if ever there was one) in an effort to "collect them all" which, whether you like the game or not, really has done more to get people moving than the sales of millions of Wii Fit boards that now gather dust under millions of TV stands ever did.
The social aspect is huge as well: you're now very likely to bump into a fellow Pokemon player who is after the same things as you, and the potential for unintended meetings is really high, which means social connections that would otherwise never have happened now will. On top of that, the idea that you can compete with rival teams to "take over" an entire town or region is utterly brilliant. Often the most successful ideas are the most simple.
Think about politics - parties spend millions trying to motivate people to get out of their houses and vote, to "take over" a town or city in the hope of a candidate becoming elected. The gamification of this idea has had drastically more success than political investment ever has or probably could and the potential to apply this simple competitive element to other areas is huge.
Finally, there's the very idea that you can simply take anywhere, any space, any area and turn it into the most awesome game you ever played. With advances in motion sensing and wearable computing it is not difficult to imagine playing a future version of Doom or similar where you literally walk into an empty building complex, put on your AR glasses and suddenly you're instantly transported into an amazing game world where you're not a passive participant but actively in the game - your physical movements all translated into the game. The level of involvement, emotion, stress and so on will be amplified to a level never seen before. All those empty, derelict buildings will suddenly gain a value they never had as the latest level in a game.
If you take it to the extreme you could find teams of people travelling all over the world to play in games in all kinds of countries (as you would now in Uncharted and other games for example, as you follow clues to an eventual goal). It has the ability to create an entirely new form of travel agent, one who will create your very own personal, world wide adventure of a lifetime.
This, ladies and gentlemen, is why a $7bn increase in company value is just the beginning. Done right, this has the ability to make companies like Apple suddenly look like a corner shop.
Computing suddenly got very, very exciting again.
If you work in computing or study it for any length of time, you'll find that the more beardy someone is, the more they embrace and promote unnecessary complexity and the more they frown upon those who would rather live their lives than debate the relative merits of type safe variables.
In many ways, computing is a stunning example of the outrageous pace at which we have managed to develop technology which has changed, and continues to change, the world. In others, it's a frustrating reminder that basically a unique club of furry toothed geeks have managed to continue to keep the doors firmly closed to mere mortals for far too long and shows exactly what happens when you let a group of "unique" people shut themselves away unattended for a significant period of time whilst they come up with "standards."
What am I on about this time?
When computing kicked off, the idea wasn't to make it complicated. Indeed, the complexity of early machines was solely down to their limitations and during the micro computer revolution, great efforts were made to make computers usable by everyone. They were pitched at families and came with great manuals which could get anyone started with programming - this is exactly why BASIC exists and where it came from. It was indeed basic and meant that virtually anyone could knock up a simple program in next to no time, and thousands of people did knock up the odd utility or program and didn't need, or want to, consider themselves "a programmer." They just used the facilities of their machine and it was just another useful thing it could do.
Then there were the dreams of what might be. One popular prediction used to be that there would come a time when computers would program themselves. We would simply give a set of parameters and it would go away and generate the code for us.
We now know that this is a wildly complex task and unlikely (but not impossible) to ever happen. However, it raises this point - the idea was always that computers were designed to be appliances.
I'm sure you'd all agree that when washing machines were invented, they were a huge leap forwards in technology and convenience. Over time, they too have developed. Have they become less accessible? Have they become more convoluted and complex? Absolutely not. It wouldn't be acceptable to expect someone to become an expert in Washing Machine Mechanics and Operational Skills before they could clean their clothes and everyone would just buy another brand or product.
So why do we suffer this in computing? It's a total absurdity.
Why did Apple blow the phone market out of the water and kill Nokia, a global giant, literally overnight? Was their product technically more capable than anything that had ever appeared before? No. Did it contain new technology that had never been seen before? Certainly not. What they had was simplicity. In bucket loads.
But this isn't about hardware, or software. Don't get me started about software - how did it take one company to show everyone that the idea was to simplify everything and then you'd sell so much you could literally Scrooge McDuck into your profits?
No, this is about programming languages.
Without languages, we know there is no software, there is no computing. In the beginning, there was machine code - and we saw that it was mental. Then there was assembly, and we saw that it was efficient, but also mental. Then there were primitive low-level languages, and we saw that they looked human only to mathematicians with an unhealthy love of the Greek alphabet. Finally, someone noticed that it'd be nice to be able to talk to a computer in something that resembles, oh I don't know, the language we speak?! And lo, we saw that it was good.
Then some crazy people said, "wait a minute, maybe eventually we'll be able to tell computers what we want to do pretty much in plain English, and it'll be so easy everyone will be able to make their devices do whatever they want! It's the pinnacle of computing!!"
and the masses doth reply, "why, this is quite something! Surely this is the most sensible idea we've ever heard!"
and all this time, the geeks were mumbling to themselves and grumbling to each other, for they were unhappy that their baby would be taken away from them and given to people who see sunlight on a daily basis and aren't worried they may be soluble in the shower. "Burn them! Witches!" they cried, and the dream of easy programming died.
If you think I'm joking, look into it. Programming got progressively easier and the 1980s and 1990s produced a generation of computer literate, code savvy users who are responsible for the products we all use today.
Then languages just became more and more convoluted. Paradigms were introduced that seemingly exist only to satisfy the sadistic urges of a very select group of technical morons who will bang their "but it's convention and correct practice" drum until someone sensible equips them with roller skates and nudges them down a greased flight of stairs whilst wearing teflon jackets.
Now we have a situation where even governments are in new trousers mode because there aren't any developers any more, because "true" computing (read coding) is once again the realm of those who are blessed with beard and everyone else can just scratch their heads and dribble a bit. Why do you think the BBC just spewed out thousands of "micro:bits" to schools across the country in a desperate attempt to get kids "enthused about coding"?
Don't get me started about those either.
Whilst everything else has moved forwards at a truly astonishing rate, programming languages have the equivalent of an obesity epidemic. They've stopped exercising, stopped trying to better themselves and instead have just got fatter to the point where they'll soon have a government ad campaign targeted at them to show just how bad for your health they are.
In 2016 it is unthinkable that we still battle with programming languages that will throw a fit if:
I spoke with a real world developer today who told me, in a way where he expected me to be aghast with awe, that he'd spent 8 hours trying to find the reason why his latest project didn't work.
Do you know what it was?
He'd typed a word in lower case.
He told it like a veteran recounting a tale of single handed bravery in the field that would make Arnie look like a blubbing wimp. All I heard was "I program in a language that normal people would question the sanity of."
Forget computing for a second: imagine I told you that if you were at work one day and you made a spelling mistake, it would not only stop you progressing and stop you working for an entire day, but could potentially mean that your work would be totally broken. You'd suggest I'd gone slightly mad and should perhaps go for a lie down in a dark room.
Yet the absurdity is programmers accept this on a daily basis. Not only that, they embrace it as some kind of weird comfort and use every triumph over these arbitrary rules, set by sadist nut cases who sit on standards committees, as some kind of proof of personal triumph, of their growing ability to beat the system!
This is all wrong. How did we get to the point where the greatest invention in human history has once more become an incomprehensible mess, accessible only by those who invent it or bask in pointless, needless problem solving?
It doesn't need to be like this. I understand there are situations where you need total control, where you need to know that every clock cycle executes instructions in exactly the way you expected, that we have totally safe and predictable systems. But if you seriously think that day to day developing needs to be as difficult as it is, then you have a screw loose.
By now, we should be moving to a place where programming is a pleasure. Where development environments literally enable you to design lovely interfaces and describe behaviours, the code is largely written for you, and when you actually do need to dive in, it's as simple as writing some structured sentences - and I mean sentences, not object.property.attribute.arbitrarypointlesslibraryfunctioncall(lotsofpointlessparameters)
If your language cannot be picked up by a 10 year old, then I don't want to know. I learned to program at the age of 7 and this was only possible because BASIC was so simple and so forgiving that you could just get straight into the good stuff - making your ideas come to life, trying to make your imagination appear on a screen. It was brilliant, because the language didn't stand in your way. Now, I feel sorry for anyone trying to learn any industry standard language.
Finally, the biggest abomination I have ever witnessed is Greenfoot. Anyone who believes that Java is a language suitable for small children is a stripy blanket short of a picnic. When it is acceptable to use conventions where you name objects with exactly the same name as their class, you have a problem. "Hey, I know, let's write deliberately confusing code that will introduce errors without people even trying! Let's make it harder by only differentiating between two very different things with only a capital letter! lololololololololol"
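To picture what that convention looks like, here's a tiny, self-contained Java sketch - not lifted from Greenfoot itself, just the same naming style - where the class and the object differ by nothing more than a capital letter:

public class NamingExample {

    // Stands in for a Greenfoot Actor subclass in this sketch.
    static class Crab {
        void act() {
            System.out.println("The crab acts.");
        }
    }

    public static void main(String[] args) {
        Crab crab = new Crab();   // "Crab crab" - the type and the variable differ only by case
        crab.act();
    }
}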
Lines just like that are everywhere in real Greenfoot/Java code. Imagine you've never programmed before. That's really intuitive, isn't it? As intuitive as stabbing yourself to death with a goldfish. Sort it out, developers, because this was never the dream.
Select.... From.... Where
Select fields From table Where criteria
SELECT... FROM.... WHERE....
A thought occurred to me after I wrote yesterday's post about neural networks/machine learning, and that is:
If you recreate a brain using computer hardware and software, you have several problems:
This raises a whole world of legal, moral and ethical issues to which I don't believe there are any perfect answers. Just a few initial thoughts from that list above make things even more complicated...
You would (will) be living in a society where technology and AI have finally evolved to a point where machines are capable of making decisions. Indeed, as most things will be automated, they will be machine controlled. The advantages of having AI control of machinery are numerous and so it's likely that even menial machines will have some form of intelligence. As an aside, if you haven't watched Red Dwarf, you really need to see the episode where Lister meets Talkie Toaster:
Imagine if those awful self service checkouts actually had some intelligence and you could talk to them? This kind of technology is coming, the beginnings are already available today and history tells us that these things only get better and better with refinement over time. Things like Siri voice control will be as normal and ubiquitous as handles on doors.
So, if our machines can think, they can also very easily turn you off and if you've been turned off you're no longer in control.
I am fascinated by the nuances of our personalities, how our minds work and what makes us... us! Because we don't truly understand how we become who we are, we cannot predict whether a machine with the same neural capability will develop skills such as empathy, understanding, kindness and so on. Maybe, machines will possess all the intelligence but no feelings whatsoever and make cold, calculated decisions.
I'm just off to see a man about a dog at Skynet...
The thought also crosses my mind that you clearly have a situation where both human and machine-human people will exist side by side. Who has precedence? Do you have equality? What if the humans simply... switch you off! Conversely and presumably, a human-machine would be able to work 24/7 without the need for sleep. Does this mean that humans are out of a job because they have needs such as sleep?
What about reproduction? You can store as many human machines as you like. Who or what decides to reproduce a mechanical person? Would human machines have the same living requirements as us? Would they even need a body in the traditional sense of the word? Probably not - if you wanted to go somewhere you could just zip down a network.
Which makes me think again - if this is an exact copy of a human mind then they will crave the ability to touch, taste, feel, love... What would happen if the machine gets depressed? A machine can surely do more damage than an individual ever could.
Really the list of questions is never ending.
Finally, the issue of security. At present we can safely assume it's pretty much impossible to hack someone's brain - we're fairly secure and, amazing advances in medicine and bio engineering aside, I think we will be for quite some time to come. However, as soon as you make something digital it becomes vulnerable to attack. Can you imagine a botnet of people? Now that would be a problem...
Makes you think, doesn't it?
Ah Arnie. Unlikely cult figure, star of so many top quality movies. Renowned the world over for his gritty, show stopping one liners and Oscar-winning performances of a lifetime.
Perhaps not, but he is brilliant enough that someone made an entire programming language dedicated to him. Indeed, if you're any kind of self respecting geek, you should be able to complete your coursework in this language - click here to get started. I'll worry about explaining the syntax to the exam board later after I've got over your sheer brilliance/audacity at following through on such an absurd idea.
Why am I rambling on about Arnie? Well, suddenly we find ourselves living in a time when the ideas portrayed in the Terminator movies are not so crazy after all. The idea that you could create a robot with super human intelligence that could take over the world and start World War 3 seemed to be rather far fetched. Now you have people like Stephen Hawking genuinely having a bit of a panic, both here and here, because they know full well recent developments point towards this being not just probable, but "OMG this is actually going to happen, what are we going to do?"
You may have seen in the news recently that Google built a really smart bit of software that can play the game Go. If you're not a board game aficionado, the basic gist is that players take it in turns to place black or white stones on a grid, capturing any enemy stones they manage to surround, and there is an astronomical number of possible moves and board combinations. That makes it a computationally difficult task, because it isn't possible to analyse every possible move for its strengths/weaknesses in a reasonable time, so you have to create a program which uses a heuristic or "friday night, that'll do, lads" approach.
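To give a flavour of what "heuristic" means in code, here's a deliberately daft Java sketch - the scoring rule is invented for this post and has nothing to do with how Google's program actually works - where, instead of exploring every continuation, you score each legal move with a cheap rule of thumb and just play the best-looking one:

import java.util.List;

public class HeuristicPlayer {

    static final int BOARD = 19;

    // Made-up rule of thumb: prefer moves closer to the centre of the board.
    static int score(int x, int y) {
        return -(Math.abs(x - BOARD / 2) + Math.abs(y - BOARD / 2));
    }

    // Pick the legal move with the best heuristic score - no look-ahead at all.
    static int[] chooseMove(List<int[]> legalMoves) {
        int[] best = legalMoves.get(0);
        for (int[] move : legalMoves) {
            if (score(move[0], move[1]) > score(best[0], best[1])) {
                best = move;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<int[]> moves = List.of(new int[]{0, 0}, new int[]{9, 9}, new int[]{3, 15});
        int[] best = chooseMove(moves);
        System.out.println("Play at (" + best[0] + ", " + best[1] + ")");
    }
}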
Computers really don't like anything other than concrete certainties, mainly down to the fact they are binary devices! There are no grey areas when you live in a world of either 0 or 1 and nothing in between. That's why in programming you have to work on absolute truths. Even in an artificial intelligence system, currently the rules are still based on immutable truths, so you've got issues already and they only get worse when we consider the human condition....
You see, us humans are weird. Take this example:
"What do you fancy for dinner?"
"Dunno, just fancy it."
If you had to think (and I mean really think) about the reasons behind your decisions, you'd have a hard time wouldn't you? I mean, why do you fancy toothpaste for dinner? You might just like the taste, you might have nothing else in the house and be too lazy to shop, you might be pregnant, not realise it and have a craving for toothpaste. Or... You just want it.
None of these reasons work in computer land. They don't get grey areas like this. How do you program a computer to understand "just because", desire or whimsical actions?
Back in the 70s people thought computers would rapidly become more intelligent than humans. Wild ideas were formed in people's minds and fantastic prophecies about future computers were made. Only someone then spoiled the party and actually sat down to try this out. They then realised it was incredibly, incredibly hard to make a computer even remotely "human", and then gave up and went for a cup of tea instead.
To this day, there is still a prize on offer (the Loebner Prize, based on the Turing test) for the first truly human-behaving computer program.
But... even so, there are some awesome developments and they all come from the field of Neural Engineering/Neural Networks.
What's that then?
Basically, the idea is to make a program which behaves exactly like the neurons in your brain. Sounds good, right? Crash course in brain science: your brain is a giant web of neurons joined by connections, signals pass along those connections, and the connections that get used successfully get stronger over time - and that strengthening is, in essence, learning.
So now you know.
The program then spends a significant time "learning." If it does something wrong, it goes back and tries another way. Each time it's successful the program strengthens that path and "learns" a skill or ability. This continues until it has learned enough to be useful. This kind of computer learning is relatively new - when I was at university the lecturers were all excited about it but it was so new they didn't actually teach us about it. Which was nice.
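If that sounds vague, here's a deliberately tiny Java sketch of the idea - invented purely for illustration, and a world away from a real neural network: the program keeps trying one of three moves, and every time a move succeeds its weight gets nudged up, so the path that works gets chosen more and more often:

import java.util.Random;

public class StrengthenPaths {

    // Pick a move with probability proportional to its current weight.
    static int pick(double[] weights, Random rng) {
        double total = weights[0] + weights[1] + weights[2];
        double r = rng.nextDouble() * total;
        if ((r -= weights[0]) < 0) return 0;
        if ((r -= weights[1]) < 0) return 1;
        return 2;
    }

    public static void main(String[] args) {
        double[] weights = {1.0, 1.0, 1.0};   // three possible moves, all equal to begin with
        Random rng = new Random();

        for (int trial = 0; trial < 10_000; trial++) {
            int move = pick(weights, rng);
            // In this toy world, move 2 happens to succeed most of the time.
            boolean success = rng.nextDouble() < (move == 2 ? 0.8 : 0.2);
            if (success) {
                weights[move] += 0.1;   // strengthen the path that just worked
            }
        }

        // By the end, the weight for move 2 dwarfs the others - it has "learned".
        System.out.printf("Learned weights: %.1f  %.1f  %.1f%n",
                weights[0], weights[1], weights[2]);
    }
}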
Here's a fantastic example of machine learning in action: this guy made a program which learns to play Super Mario, and it really is impressive. Not only that, this will give you great insight into how the process works and the limitations of current work:
While you're at it, you should watch it learn to play Mario Kart too. Then you should all consider doing something similar for your coursework....
How good is that?!
Anyway, on to more important things... Who wants to die?
Well, not me, and it seems that a certain Russian guy and I have a lot in common, because if I had a spare hundred million pounds or so knocking about under the fridge, I too would pay lots of beardy scientists to come up with the technology to make me live forever. Until science works out the meaning of life, at least.
The BBC reports that Dmitry Itskov would rather like technology to come to the rescue and make him (and the rest of us, he's a generous sort) immortal.
The way he plans to do this is to basically create a copy of every neuron in his brain. This is nowhere near as nuts as it sounds and if you look into the pace of development in technology, you'll realise that his 30 year time scale is also not so bonkers. We will absolutely certainly have the storage capacity and processing power necessary to emulate a human mind in 30 years. This still leaves a lot of questions to be answered and some of them are a bit of a bugger...
And this raises one final point, which is more relevant to you than to me (although please do bring me back from the dead when the technology works), and that is the fact that the young people of today will grow up in a society that has to answer these kinds of questions and more. You will be expected to make decisions that are harder than any that have gone before, decisions which literally impact the very definition of what being alive actually means.
What can we learn from software history?
On August 24th 1995, Microsoft released Windows 95 and it was absolutely mental. The world, quite simply, had never seen anything like it. It was Microsoft's iPhone moment and did exactly for them as the iPhone did for Apple - made them the biggest company in the world. Microsoft were richer than oil companies, banks and... some countries.
People actually stood outside shops in huge queues at midnight to buy Windows 95 (and you thought this only happened for Apple products). They even hired the cast of Friends to make an hour long instructional video on how it worked. It's hilariously bad and you should watch it by clicking here...
Microsoft supplied people with packs so they could have Windows 95 launch parties. Seriously, they did.
The Rolling Stones were even reportedly paid 8 million dollars so that Microsoft could use one of their worst songs as the music for the TV advert.
You may not have had a PC in your home in 1995, I certainly didn't, but regardless you really did know that Windows 95 was being released. It was, quite literally, everywhere.
You don't really need an iPhone or an Apple Watch, but it's highly likely when you didn't have one that you wanted one anyway. This is the genius of marketing, selling us things in such a way that even though we don't need one to live our lives, we want one anyway because we truly believe it will change the way we live. This is exactly what happened with Windows 95.
I can remember vividly talking to the one and only teacher in my secondary school who understood computing (he also ran the network) about it. We were about to get about 50 new computers, Pentium 166MMX machines with 32MB of RAM and Windows 95! I was excited and I didn't even understand why. As an aside, this was at a time when the technician at my school wore a white lab coat, like computers were some kind of secret, magic scientific experiment. The whole setup was quite amusing looking back.
Computing certainly wasn't new in 1995, but really there wasn't any kind of defined, world wide accepted standard. Without standards there will always be a struggle to educate people - which platform do you decide to learn? Which one should you use?
Computers were also still eye wateringly expensive, easily £1500 for a basic model. Most computers of the time were 486 machines running at about 66-100MHz with 4-8MB of RAM. Windows 95 needed 16MB to run relatively smoothly, and this was a big deal because 8MB of EDO RAM at the time cost a significant chunk of cash...
Consequently, PCs weren't really that popular in homes and you certainly didn't need one to get by, do your homework or browse a fledgling internet. Have a look at this graph from the National Statistics Office:
The graph shows the rise of home computing that I've talked about in a previous post, but it was relatively steady and quite stagnant until 1995, then look what happens - very suddenly the uptake of home computers goes through the roof. It's interesting as an aside to note that even in 2010, 20% of households reported owning no computer equipment, which is staggering when you think about it, but also goes to show why you shouldn't presume everyone has access to technology.
This upturn in the graph is, quite simply, the "Windows 95 Effect." Suddenly, because of a piece of software, you realised you needed a PC.
Microsoft had realised that computers were hard to use and needed to be simplified and this thing called the internet was coming...
Take a look at this:
This is Windows 3.1 - the predecessor to 95. I seriously doubt you'll ever have used it - French air traffic control still do, but that's another story. If you want, you can actually try it out in your web browser and get a feeling for how awesome owning a 486 was back in the day - click here to try it out.
Windows 3 itself was a huge leap forward, but the world didn't take much notice. It wasn't great - the idea was simply to make DOS a bit more bearable. It was 16 bit (limited) and crashtastic. Even so, it was far more user friendly than a command line interface and allowed more people to just get things done, which is the whole point of having a machine - right?
Why did people get so excited about Windows 95, then?
Well, because it genuinely was a huge, huge leap forwards in so many areas. Like the micro computer, Nintendo Wii, iPod, iPhone to name a few, it did something truly new, truly revolutionary that changed the way people looked at hardware and, also, changed their expectations forever.
The Wii sold millions of units to older people because it was just so easy to use and found a niche in exercise and other physical games. No one had ever opened this door before. The iPhone was the same, suddenly people realised what a smartphone should be, but more than that, it introduced a whole new way of working that was so simple anyone from a 2 year old to your grandmother could use it. Again, this was all new.
The things that make a piece of software, or indeed hardware, such a big deal are surprisingly simple:
The only problem is, these things are actually incredibly difficult to achieve and are often reached by accident or curiosity rather than by design. Take Google: Larry Page and Sergey Brin just joked about what would happen if they tried to download the whole internet - then they did. Then they changed the world and became billionaires.
Windows 95 introduced the following ideas:
Every single one of these was a big deal on its own; put them all together in one go and you begin to understand why it was such a monumental event.
Don't get me wrong, it wasn't perfect. You would have to reinstall it almost weekly and it would crash if you even thought about doing something it didn't like. Sometimes it would just crash because it was bored.
However, fast forward 21 years and what's different?
Those fundamental design ideas haven't gone anywhere. In fact, the Start menu was such a big deal that when it disappeared in Windows 8, they had to bring it back again.
Of course, there have been improvements, not least stability. When Windows 2000 came along it was a revelation - you could leave your computer on all week and it would still work, which really was something of a novelty. Windows 98 was so buggy it needed fumigating, and Microsoft released a second edition very quickly before users dropped it in their thousands. Windows 2000 wasn't even meant for home users but thousands made the switch when they realised how good it could be. Oh, and they'd upgraded their RAM again, of course!
Windows has reached and passed its maturity stage and is now in decline. No longer is the desktop or laptop the computing device of choice and you are far more likely to use Android or iOS to do your daily web browsing or YouTube watching. No longer is each release a big deal. I stopped getting excited by new versions of Windows at XP, especially when Microsoft planned a stunningly brilliant release (Longhorn, if you're interested), then ditched literally all of it and released Vista, which is the most appalling piece of software ever written. It was at that point (2005-ish) that Microsoft lost control of the computer market.
Why did this happen? Well, it's typical of any product lifecycle, but even more so in computing. Around once every 10 years something amazing comes along which is then adopted by world + dog. Then begins a constant cycle of upgrades that don't add a great deal other than colour changes, new icons and a feature or two, but nothing which really blows your socks off after the third version (think about what's amazing from the iPhone 4 onwards - not a great deal). Then begins a decline in interest before the next big thing.
In our recent past there has been Windows, Apple, Google and Facebook. The biggest problem for all of these is that they're mature, they're out of ground breaking ideas and no one, not even these huge companies have the faintest idea what the next big thing will be - they guessed at smart watches and that hasn't happened yet. The interesting thing is, whatever it is has just been thought up by someone and is about to happen...
So what can we learn from software development? From great ideas comes great software (given a chance), then comes improvement, refinement and mass adoption. Then... Well, inevitably they either miss the boat completely or ruin it. Look at Microsoft Office: I'm willing to bet every single one of you could use Office 97 and have no issues at all. Microsoft are so desperate now that their idea of a new version of Office is to give you your colours back (why did they think we all wanted to work in dull tones of grey??) and change the menu text from lower case to all upper case. There's progress for you.
Not to mention Jony Ive's appalling iOS 7, which saw him take a set of highlighter pens to everything.
Everything you currently know will be nothing in 10 years time. What comes next... well, I'm excited to find out, it's time for something new and brilliant.
It shouldn't ever surprise you that technology depreciates almost as quickly as a car. It's not as bad today as it used to be, but any computer, laptop or console you buy will gradually decrease in value until it's worth a few quid or simply gets consigned to landfill.
Why? New models come out, technology improves, standards change and equipment simply becomes obsolete - it can no longer perform the tasks we ask of our equipment and has no place in a modern home, office or school. Hence, 5-6 years after a purchase, most technology ends up in the bin or on its way to China or India where some poor worker will contract cancer from recycling (or just burning) our e-waste.
Only, something surprising is happening and it is making me wish I'd never thrown my old computers away...
I was fortunate enough to grow up in what is now viewed as a golden age of computing. The micro computer revolution had happened and Bill Gates had just made his famous prediction that "in the future I can imagine a PC being in every home." Everyone thought he was clearly a bread based good short of a blanket seated dining experience. Why would you want a computer, what would you even use it for?? Now look around you - smart watch, smartphone, TV set top box, smart TV, games console, laptops... all computers, all seen as throwaway devices.
In 1979 a revolution began, fuelled by three companies - Commodore, Acorn and Sinclair. Unfortunately, if you read many accounts of computing history, you'll read that Apple were the catalyst at this point, and it's simply not true - Apple were on their knees with terrible products, high prices and non-existent sales. Indeed, it was Commodore engineers who helped Steve Wozniak fix his broken Apple II design, and it was the sheer stupidity of Commodore executives in ignoring the offer of exclusivity when VisiCalc came along (arguably the most important software release of all time) that meant Apple survived to live another day. If Commodore had taken on VisiCalc, I'd be confident to suggest Apple would simply have gone out of business or been taken over.
Anyway, this was a time of the home computer - the micro computer. For the first time ever computers were no longer the domain of large business, banks, governments and universities. Anyone could go out and spend £300-700 and bring home a computer which didn't fill a room and require a new power line to be run to their house.
Weirdly, when products like the Apple I and Commodore PET were introduced people literally didn't have a clue what to do with them. They were seen as novelty items for geeks and electronics enthusiasts. There were no killer applications at the time for home use and so they sold in relatively low numbers to home users. This is the same position the smart watch market finds itself in now - a select few buy them, some buy them for fashion reasons, but no one can say they are an essential item to own, unlike a mobile phone for example.
Rapidly, though, a few things happened. The price of these devices began to fall, faster and more capable chips were developed and an ever growing group of developers began to make useful software and entertaining games. It wouldn't be too far fetched to suggest that games developers were largely responsible for the rise of computing in the home. It was an absolutely fascinating time, no one had ever made a computer make noise, let alone music before. Nor were graphics really understood, and you could count on one hand the number of chips that had been designed that could output a decent image to the screen (again, Commodore/MOS engineers had led the way for years in this field). Imagine living at a time where computers may as well have not existed and then suddenly, in the space of about 5 years, these devices started to appear that allowed you to automate tasks, help you in your job and, given an imagination and some programming skill, enabled people to make worlds of their own in software. It was incredible.
A generation had learned to code and interact with machines that were quite frankly underpowered, difficult to use and very fickle. You'd have everything saved to cassette tape, or 5 1/4 inch floppy disc if you were rich (or a school/business), and one slight blip in the power supply - or a sneeze at the wrong moment - and they'd crash all over the place and you'd lose everything you'd done. Hard drives were the stuff of imagination and dreams.
Then came the next wave - IBM PCs and games consoles. Yes, Atari had made some awful consoles in the 1970s and 1980s, but now came the time of Nintendo and Sega with a wave of innovation and creativity that was quite simply staggering, and nothing would ever be the same again.
In the late 1980s and early 1990s there was still a lot of skepticism about how useful a personal computer could really be in the home. Don't forget there was no mass adoption of the internet at this point (it was hideously expensive to connect in the UK through your phone line) but people were starting to see how the things they were doing at work could be of use at home. A generation of kids crossed their fingers and hoped the answer to their homework was on the Microsoft Encarta CD-ROM (an encyclopedia) so they could copy and paste it... As a side note, Encarta was a big deal back then and the idea that an encyclopedia could be fitted on to a single disc was quite simply staggering. I remember thinking it was fairly incredible myself, but I didn't have a PC by then so it was of little use - homework would have to be done out of, wait for it, a book!
What people were in no doubt about, however, was how good computers could be for playing games. The games market had gone mental and products like the 8-bit Nintendo Entertainment System and the Sega Master System had given people their first taste of Mario, Zelda and Alex Kidd - future classics! These machines were no more powerful than the computers available at the time (in fact rather less so in some cases) but the idea of just turning it on and playing was novel and brilliant. They were, quite simply, awesome.
I mentioned these were 8-bit systems. People knew exactly what this meant back then. When you bought a console, home computer or a PC (if you had a spare £5000), they came with manuals that were reminiscent of phone books. Old phone books. Oh god... you've probably never seen a phone book, have you?!
When these books plonked out of the box on to the floor, people didn't ignore them. They had no choice. This is what you got when you turned on your £5000 PC: a black screen with a blinking command prompt, and nothing else.
So what do you do? You read the book! As a result, users knew exactly how their computers worked, why certain things happened in a certain way, where their files lived, how to increase the performance of their system, the importance of sensible file system organisation and so on. They learned how to manipulate things efficiently, write batch files that did common tasks for them and they could truly tailor their computing experience to their needs. Something that is sadly not really known about any more, except for those with an education in computing. Not everything gets better with progress!
The 1990s, then, saw arguably the greatest era of development in the history of computing so far. Prices dropped through the floor as PC manufacturers fought over who could release the fastest machine for the least money. Intel made sure that your computer was obsolete within 6 months.
This was no joke. Within a couple of years of the Pentium II being released, CPU speeds went from 366MHz to 800MHz in a single year. This was crazy. At the time, an increase of 33MHz was noticeable. To spend £1500 on a PC that was basically landfill in 6 months was just unthinkable - but it happened.
At the same time, a golden era of console gaming happened. Nintendo released the SNES and Sega the Megadrive. A whole generation enjoyed the growth of Sonic the Hedgehog, Super Mario, Zelda, Starfox and so forth. Oh and a little known company released a terrible football game called Fifa Soccer...
These were 16-bit machines - twice the processing power of their predecessors and, who'd have thought it, ergonomically designed controllers! Nintendo even learned from their mistakes and designed a console you couldn't balance a drink on top of, after about a million NES consoles went bang when an excitable child dumped a glass of milk over the electronics.
Ten years ago you couldn't give this stuff away. Got an old 486? Pentium 166? Pff, not even worth the scrap metal. A Super Nintendo?! Put it in the bin!
Oh how misguided we were.
I owned some machines that are now so rare it makes your eyes water how much people are willing to pay for them. I once owned a Commodore 128D and gave it away. Good ones now sell for around £300. Commodore 64 boxed? You're looking at £50-100.
486 DX4 with monitor? Up to £300.
SNES in boxed condition? They're going for at least £150.
These prices might not seem that high to you, but consider this: 5-10 years ago the same devices would've sold for £10-20, if that. You could easily pick one up for free if you looked in the paper; people wanted rid. Companies literally hired skips and threw thousands of 486 and Pentium-class PCs away. Now, mark my words, the prices of these machines are only going to go in one direction, and that isn't down. As more and more people want a piece of their past, fewer and fewer machines are available as they break, get thrown out or are held on to by owners who realise their value. As with anything, sooner or later everything becomes history, and history is valuable! Even though millions upon millions of these devices were made, because of our disposable attitude to goods and the fact electronic devices can and do break, very quickly numbers become sufficiently small as to create value again.
Now, the tide has turned and people actually want these machines. Yes, you can emulate and use modern hardware, but it's the experience of ownership, the reminiscence of a lost youth and the feeling of using the original kit that is driving this market. "Retro" gaming is taking off and is about to get seriously huge. As you grow older, suddenly you can afford all the stuff that was way, way out of your league when you were young. Imagine the system/console you dream of - in 5 years time it'll be pennies - buy it.
If I could give you one piece of advice when it comes to any console, PC or similar it would be KEEP IT! Keep the box, all the packaging, the lot. You stand a chance of it being worth a fortune in the future. If you don't sell it, you never know, the memory itself may be worth keeping it for. Oh, and if you have any old Apple kit, it's becoming more and more valuable - people are buying it for the design alone.
Computers, it seems, are the antiques of the not too distant future.