In recent updates:
Lots of other smaller updates include:
A significant amount of new material is now appearing on the site. To summarise:
The schedule for updates is, in this order of priority:
By the end of this academic year:
Next academic year:
Obviously, anything can change in the meantime and some things may appear before others.
Big updates are on the way, finally, including the revision notes for units 1 and 2 of the new GCSE.
If you're not already making use of Office 365 and your 1TB (that's a lot) of online storage, then you should be.
All you need is your school email address and your normal school password. Then you can log in at the Office 365 portal here.
Reasons you should be using it:
Watch the video below for more information.
I've recently been developing an application at school which pulls class data out of our management system and spits out an editable seating plan. It's meant to be an easy-to-use, press-one-button-and-don't-worry-about-it job.
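To give a flavour of the core idea (a minimal sketch only - the real tool reads live class data and produces an editable plan, and every name here is invented):

```java
import java.util.List;

public class SeatingPlanSketch {
    public static void main(String[] args) {
        // Invented names standing in for the class data the real tool pulls
        // from the management system.
        List<String> students = List.of("Aisha", "Ben", "Chloe", "Dan", "Erin", "Finn");
        int seatsPerRow = 3;

        // Print the class as a simple grid of rows - the crudest possible "plan".
        for (int i = 0; i < students.size(); i++) {
            System.out.printf("%-10s", students.get(i));
            if ((i + 1) % seatsPerRow == 0) {
                System.out.println();
            }
        }
    }
}
```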
After a few evenings of programming I had something that worked fairly well and got to the point where I needed to test it with some different data sets to make sure it didn't throw its toys out of the pram when presented with unexpected information.
I got to a point where I thought it was pretty much finished, and off I went down to the technicians' office to share what I'd been doing and get them to test it out for me. So, there I was, clicking away, showing how you could go back, change your mind, change options, load a different data file and so on, and it was all working as it should.
No. I lie. As is so typical in computing, what works perfectly for you every single time will inevitably die in a hideous fashion as soon as you present it to anyone. It's like oranges and bananas in your fruit bowl - they're fine until you walk out of the room, at which point they immediately go soft, brown or gooey.
I fired it up, loaded the data file and showed how it works - at which point I changed one of the options and suddenly all the data disappeared. I then uttered the words that all people who work in IT say when they have absolutely no idea what's going on - "that's interesting" - which, translated, means, "I've broken it and have no idea how, why or what I'm going to do about it."
If you were in my class I'd now suggest that you bung a breakpoint in the program at a sensible point and then step through using the debugger to find out what's going on. So, once I'd got to the point where I could replicate the error every single time, this is exactly what I did, and this is where it gets weird.
You can switch off now if you're not interested in the beardy bits...
The data in my program is stored as a list of objects. When a button is pressed, that list is passed to another form, which rearranges the data into the order the user selects and then fills the form in the layout requested. None of the code is destructive, and once the form is closed, the data is re-sent from the main form if another plan is required.
What was happening was that at some point the list of objects was being cleared - which should be fairly easy to pin down. So I started following the program and watching the contents of the list at each stage.
The code would run first time without any issues whatsoever, but click the button a second time and, when a certain subroutine returned, the list would be empty. The strange thing was that at the end of the subroutine the list was populated, but on returning it was cleared.
"it's a byref, byval problem!" I can hear you say. "Think about the scope!"
I'd agree with you, but if that were the case the list would be cleared every time. Nothing had changed; the exact same code was being run. First time, fine. Second time, not fine.
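For anyone who hasn't met that pitfall, here's the idea sketched in Java purely for illustration (my names, not the real code): a parameter is just another reference to the caller's list, so if the subroutine really were clearing a shared list, it would do it deterministically, on every run, not just on the second click.

```java
import java.util.ArrayList;
import java.util.List;

public class SharedListDemo {
    // The parameter is just another reference to the caller's list, so anything
    // destructive done here happens to the caller's data as well.
    static void rearrange(List<String> students) {
        students.clear();
        students.add("rearranged");
    }

    public static void main(String[] args) {
        List<String> plan = new ArrayList<>(List.of("Aisha", "Ben", "Chloe"));
        rearrange(plan);
        System.out.println(plan); // [rearranged] - and it prints this every single run
    }
}
```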
Scratching around for a solution, I then commented out one line of code. This line came before the list was finally populated at the end of the subroutine, and the debugger had shown it working fine.
It worked. No bug any more.
This was mind-bending for so many reasons - the line didn't empty the list on the first run through, didn't empty the list when debugged, and wasn't the point at which the list was disappearing in the debugger - that was happening three or four statements later! And yet... it fixed the problem.
This happens in programming sometimes. You can do something that absolutely shouldn't work or shouldn't have any effect on anything and yet you'll get bizarre results. I'm still lost as to why this fix worked, but considering my code isn't going to risk anyone's life, I think I'll take it and walk away...!
Inevitably, this can happen in class as well - try explaining that one to 25 perplexed students...
How many of you can type at a reasonable speed, say around 50 words per minute? If you're not sure, test yourself by clicking this link. I tend to average around 65wpm with text that I don't know, higher if I'm typing original text. My typing has actually slowed in recent years; when I was at university, and probably doing the most typing I've ever done, I averaged around the 80wpm mark without too much trouble. As with anything, it's down to practice.
But why is it important? The fact is, to be productive you have to be able to type. No one, despite millions being spent on research and development, has come up with a better method of text entry than the QWERTY keyboard. Sometimes it's just the case that an old design is... the right design! Take a look around a Year 7 classroom (and later years, I'm not singling them out) and you'd be shocked at how poor students' typing skills are - it's not unusual to find students who only use one finger to jab at the keys in an incredibly painstaking process that takes, unsurprisingly, forever.
The technology of each generation does tend to dictate the set of skills they end up either consciously or inadvertently learning along the way. If you go way back, it was usual for students to learn to take notes in shorthand, or to touch type (no looking at the keys, ever). Move forwards to the 1990s and for well over a decade people around the world suddenly became amazingly adept at typing because of the growth of instant messaging over the internet. It wasn't uncommon to have an AOL IM account, Yahoo Messenger and MSN Messenger all running at the same time, with at least 6 conversations on the go and the taskbar flashing at you like a police car, demanding your attention. Every young person at the time knew how to alt-tab their way around each box and hammer out messages at well over 100wpm without thinking - they learned to adapt because they had to!
Strangely, that seems to have been the golden age of communication online. Since that time all the messenger services have died off for one reason or another and nothing really has replaced them. In that time, we've had the advent of touch screen devices and a real shift away from traditional PC use, meaning that today's students simply don't use computers in the way people of an older generation are used to, or expect.
This is actually causing quite a problem. I asked a class of Year 7 students recently, "how many of you use a desktop PC or laptop at home on a regular basis?" Approximately 3 hands went up - 10% of the class. At first, this surprised me. We asked other classes and the numbers never rose much above 20% of students in each group. This goes a long way to explaining some of the problems we have when students arrive at secondary school, especially the length of time they take to adjust to our way of working, but particularly to how a computer works. The number one problem used to be that students could not understand the idea that one "drive" was theirs and another was shared on a network that they couldn't save to. Now it's that, coupled with the fact that... students don't really know how to use a traditional computer at all!
Why was I surprised, though? My own desktop computer sits in a cupboard at home gathering dust, not because I'm not interested in computing any more, but simply because there isn't a need to use it. Desktop PCs are antisocial machines that usually require you to sit out of the way and, these days, don't actually offer any compelling functionality unless you're doing tasks which require a powerful machine. This is compounded by the fact I now have a very portable and powerful laptop in the form of a Surface Book, which means I have even less motivation to use a desktop, but... I still use a PC every day!
So what are younger people doing? They're using tablets and, predominantly, phones. Why wouldn't they? The internet is available in its entirety in their pocket, they can communicate and listen to music, share experiences and have everything they need in one place. They're extremely adept at using these devices and are quick to share new applications, methods of working or just things they think are interesting. In this way, they're the equivalent of our MSN generation: they've learned to use the technology available to them and adapted to it perfectly.
The outcome of this is, strangely, that we have in many ways returned to the early 1990s, where there is about to be a real need for the teaching of "ICT" skills again. Students are brilliantly aware of the technology that surrounds them, but woefully ill-equipped to deal with the working world they will move into. Yes, I acknowledge that touch screen/tablet technology will become more acceptable in business, more mainstream and more integrated with traditional machines, but the bottom line is, to do decent work you are going to need to be adept at using a mouse, keyboard and desktop operating system. No one has come up with a better method of working and I cannot see this changing in the short to medium term.
One thing you can be certain of, however, is that these technologies will merge. Look back and you'll see the perfect example: the modern smartphone. Where once you needed a computer to browse the "full" internet, a music player to store your music and a phone for communications, they are now all one device. Laptops are already changing to have screens which detach to become tablets, but no one has found the perfect medium yet - and maybe no one ever will.
Apple have spent millions on research into touch screens and traditional computers, and Steve Jobs, before his death, was adamant that touch screens don't work on traditional computers - the two just don't match. It's interesting, then, that Apple are now desperately trying to sell iPads with keyboards and touting them as "computers." The fact here is that Microsoft may well have accidentally gone down the right path after all with Surface, in that it's a fully featured PC in a convenient form factor. Only it's absurdly expensive...
Anyway, the bottom line is, to be productive we need typists. We need people who can work across a range of technology and, sadly, right at the time when our students need it the most... we've killed off ICT. Nice work, CAS!
Love it or hate it, people young and old are going nuts for a bit of Pokemon Go. On the face of it, it all seems rather pointless - you walk around to find characters and then flick them a bit. But there is much, much more to it than this and it's heralding a new era of games that will completely change how people think of gaming.
Since the release of Pokemon Go, the value of Nintendo has risen by over $7bn as people have been rushing to buy shares in the company. Why? Well, investors in general are not stupid people; they're responsible for vast sums of money and, amongst other things, our pensions - so nothing too important. In simple terms, they're gamblers who have unlimited amounts of someone else's cash to waste. Sounds good to me.
So why have investors been betting on Nintendo, or at least showing strong support for a new concept? The reason is that this is the first successful example of mass adoption of Augmented Reality, and the potential market is absolutely massive - big emerging markets are like fairy dust, Christmas, rainbow-riding unicorns and pots of gold at the end of very large rainbows to investors, because they can bet low and make absurd profits when it all goes mental.
Augmented Reality is in its total infancy at the moment and will either surpass or at the very least complement the very best virtual reality. The fact is that augmented reality is more accessible than VR, has a much wider range of opportunity than VR and solves all the problems of VR without trying - you don't need to stay in your house or in one location, you're not going to break your nose running into a wall or smash your house up because you can't see it, and it won't make you redecorate the place with vomit because of motion sickness.
Then there's the potential, not just creative but also social. People are already getting really excited that technology may actually hold a solution for the one big problem it has helped to create - a distinct lack of movement. Now people are running around town, chasing imaginary characters (a wonderful image of a crazy society if ever there was one) in an effort to "collect them all" which, whether you like the game or not, really has done more to get people moving than the millions of Wii Fit boards that now gather dust under millions of TV stands ever did.
The social aspect is huge as well: you're now very likely to bump into a fellow Pokemon player who is after the same things as you, and the potential for unintended meetings is really high, which means social connections that would otherwise never have happened now will. On top of that, the idea that you can compete with rival teams to "take over" an entire town or region is utterly brilliant. Often the most successful ideas are the most simple.
Think about politics - parties spend millions trying to motivate people to get out of their houses and vote, to "take over" a town or city in the hope of a candidate becoming elected. The gamification of this idea has had drastically more success than political investment ever has or probably could and the potential to apply this simple competitive element to other areas is huge.
Finally, there's the very idea that you can simply take anywhere, any space, any area, and turn it into the most awesome game you ever played. With advances in motion sensing and wearable computing it is not difficult to imagine playing a future version of Doom or similar where you literally walk into an empty building complex, put on your AR glasses and are instantly transported into an amazing game world where you're not a passive participant but actively in the game - your physical movements all translated into the game. The level of involvement, emotion, stress and so on will be amplified to a level never seen before. All those empty, derelict buildings will suddenly gain a value they never had as the latest level in a game.
If you take it to the extreme, you could find teams of people travelling all over the world to play games in all kinds of countries (as you would now in Uncharted and other games, for example, as you follow clues to an eventual goal). It has the ability to create an entirely new form of travel agent, one who will create your very own personal, worldwide adventure of a lifetime.
This, ladies and gentlemen, is why a $7bn increase in company value is just the beginning. Done right, this has the ability to make companies like Apple suddenly look like a corner shop.
Computing suddenly got very, very exciting again.
If you work in computing or study it for any length of time, you'll find that the more beardy someone is, the more they embrace and promote unnecessary complexity and the more they frown upon those who would rather live their lives than debate the relative merits of type-safe variables.
In many ways, computing is a stunning example of the outrageous pace at which we have managed to develop technology which has changed, and continues to change, the world. In others, it's a frustrating reminder that a unique club of furry-toothed geeks has managed to keep the doors firmly closed to mere mortals for far too long, and shows exactly what happens when you let a group of "unique" people shut themselves away unattended for a significant period of time whilst they come up with "standards."
What am I on about this time?
When computing kicked off, the idea wasn't to make it complicated. Indeed, the complexity of early machines was solely down to their limitations, and during the microcomputer revolution great efforts were made to make computers usable by everyone. They were pitched at families and came with great manuals which could get anyone started with programming - this is exactly why BASIC exists and where it came from. It was indeed basic, and it meant that virtually anyone could knock up a simple program in next to no time. Thousands of people did knock up the odd utility or program and didn't need, or want, to consider themselves "a programmer." They just used the facilities of their machine, and it was just another useful thing it could do.
Then there were the dreams of what might be. One popular prediction used to be that there would come a time when computers would program themselves. We would simply give a set of parameters and it would go away and generate the code for us.
We now know that this is a wildly complex task and unlikely (but not impossible) to ever happen. However, it raises this point - the idea was always that computers were designed to be appliances.
I'm sure you'd all agree that when washing machines were invented, they were a huge leap forwards in technology and convenience. Over time, they too have developed. Have they become less accessible? Have they become more convoluted and complex? Absolutely not. It wouldn't be acceptable to expect someone to become an expert in Washing Machine Mechanics and Operational Skills before they could clean their clothes and everyone would just buy another brand or product.
So why do we suffer this in computing? It's a total absurdity.
Why did Apple blow the phone market out of the water and kill Nokia, a global giant, literally overnight? Was their product technically more capable than anything that had ever appeared before? No. Did it contain new technology that had never been seen before? Certainly not. What they had was simplicity. In bucketloads.
But this isn't about hardware, or software. Don't get me started about software - how did it take one company to show everyone that the idea was to simplify everything and then you'd sell so much you could literally Scrooge McDuck into your profits?
No, this is about programming languages.
Without languages, we know there is no software, there is no computing. In the beginning, there was machine code - and we saw that it was mental. Then there was assembly, and we saw that it was efficient, but also mental. Then there were primitive low-level languages, and we saw that they were human-readable, provided the human was a mathematician with an unhealthy love of the Greek alphabet. Finally, someone noticed that it'd be nice to be able to talk to a computer in something that resembles, oh I don't know, the language we speak?! And lo, we saw that it was good.
Then some crazy people said, "wait a minute, maybe eventually we'll be able to tell computers what we want to do pretty much in plain English, and it'll be so easy everyone will be able to make their devices do whatever they want! It's the pinnacle of computing!!"
and the masses doth reply, "why, this is quite something! Surely this is the most sensible idea we've ever heard!"
and all this time, the geeks were mumbling to themselves and grumbling to each other, for they were unhappy that their baby would be taken away from them and given to people who see sunlight on a daily basis and aren't worried they may be soluble in the shower. "Burn them! Witches!" they cried, and the dream of easy programming died.
If you think I'm joking, look into it. Programming got progressively easier, and the 1980s and 1990s produced a generation of computer-literate, code-savvy users who are responsible for the products we all use today.
Then languages just became more and more convoluted. Paradigms were introduced that seemingly exist only to satisfy the sadistic urges of a very select group of technical morons who will bang their "but it's convention and correct practice" drum until someone sensible puts them in Teflon jackets and roller skates and nudges them down a greased flight of stairs.
Now even governments are in new trousers mode, because we are facing a situation where there aren't any developers any more: "true" computing (read: coding) is once again the realm of those blessed with beard, and everyone else can just scratch their heads and dribble a bit. Why do you think the BBC just spewed out thousands of micro:bits to schools across the country in a desperate attempt to get kids "enthused about coding"?
Don't get me started about those either.
Whilst everything else has moved forwards at a truly astonishing rate, programming languages have suffered the equivalent of an obesity epidemic. They've stopped exercising, stopped trying to better themselves and instead have just got fatter, to the point where they'll soon have a government ad campaign targeted at them to show just how bad for your health they are.
In 2016 it is unthinkable that we still battle with programming languages that will throw a fit if:
I spoke with a real world developer today who told me, in a way where he expected me to be aghast with awe, that he'd spent 8 hours trying to find the reason why his latest project didn't work.
Do you know what it was?
He'd typed a word in lower case.
He told it like a veteran recounting a tale of single-handed bravery in the field that would make Arnie look like a blubbing wimp. All I heard was "I program in a language that normal people would question the sanity of."
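He didn't tell me which language, but in most mainstream languages one lower-case letter really is enough to sink you, sometimes without so much as a warning. A trivial Java sketch of the sort of thing that eats an afternoon:

```java
public class CaseMatters {
    public static void main(String[] args) {
        String day = "monday";             // typed in lower case somewhere upstream
        if (day.equals("Monday")) {        // never true - the comparison is case-sensitive
            System.out.println("Start of the week");
        } else {
            System.out.println("No match, no error - just silently wrong");
        }
    }
}
```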
Forget computing for a second. Imagine I told you that if you made a spelling mistake at work one day, it would not only stop you progressing and stop you working for an entire day, but could potentially mean that your work was totally broken. You'd suggest I'd gone slightly mad and should perhaps go for a lie down in a dark room.
Yet the absurdity is that programmers accept this on a daily basis. Not only that, they embrace it as some kind of weird comfort and use every victory over these arbitrary rules, set by sadistic nutcases who sit on standards committees, as some kind of proof of personal triumph, of their growing ability to beat the system!
This is all wrong. How did we get to the point where the greatest invention in human history has once more become an incomprehensible mess, accessible only by those who invent it or bask in pointless, needless problem solving?
It doesn't need to be like this. I understand there are situations where you need total control, where you need to know that every clock cycle executes instructions in exactly the way you expected, that we have totally safe and predictable systems. But if you seriously think that day to day developing needs to be as difficult as it is, then you have a screw loose.
By now, we should be moving to a place where programming is a pleasure. Where development environments literally let you design lovely interfaces and describe behaviours, with the code largely written for you, and where, when you actually do need to dive in, it's as simple as writing some structured sentences - and I mean sentences, not object.property.attribute.arbitrarypointlesslibraryfunctioncall(lotsofpointlessparameters)
If your language cannot be picked up by a 10 year old, then I don't want to know. I learned to program at the age of 7, and this was only possible because BASIC was so simple and so forgiving that you could get straight into the good stuff - making your ideas come to life, trying to make your imagination appear on a screen. It was brilliant, because the language didn't stand in your way. Now, I feel sorry for anyone trying to learn any industry standard language.
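To put that in perspective: a complete BASIC program a seven-year-old could type was 10 PRINT "HELLO" - one line, one idea. Here's the same thing in Java, the sort of industry standard language beginners are pointed at today:

```java
// The minimum ceremony Java demands before it will print a single word
public class Hello {
    public static void main(String[] args) {
        System.out.println("HELLO");
    }
}
```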
Finally, the biggest abomination I have ever witnessed is Greenfoot. Anyone who believes that Java is a language suitable for small children is a stripy blanket short of a picnic. When it is acceptable to use conventions where you name objects with exactly the same name as their class, you have a problem. "Hey, I know, let's write deliberately confusing code that will introduce errors without people even trying! Let's make it harder by differentiating between two very different things with nothing but a capital letter! lololololololololol"
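Something along these lines, for example (a representative line of my own, illustrating the convention):

```java
// Inside a Greenfoot method - "Counter" is the class, "counter" the object
Counter counter = new Counter();
```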
Lines exactly like that are everyday Greenfoot/Java. Imagine you've never programmed before. That's really intuitive, isn't it? As intuitive as stabbing yourself to death with a goldfish. Sort it out, developers, because this was never the dream.
SELECT fields FROM table WHERE criteria
A thought occurred to me after I wrote yesterday's post about neural networks/machine learning, and that is:
If you recreate a brain using computer hardware and software, you have several problems:
This raises a whole world of legal, moral and ethical issues to which I don't believe there are any perfect answers. Just a few initial thoughts from that list above make things even more complicated...
You would (will) be living in a society where technology and AI have finally evolved to a point where machines are capable of making decisions. Indeed, as most things will be automated, they will be machine controlled. The advantages of having AI control of machinery are numerous and so it's likely that even menial machines will have some form of intelligence. As an aside, if you haven't watched Red Dwarf, you really need to see the episode where Lister meets Talkie Toaster:
Imagine if those awful self-service checkouts actually had some intelligence and you could talk to them. This kind of technology is coming - the beginnings are already available today, and history tells us that these things only get better and better with refinement over time. Things like Siri voice control will be as normal and ubiquitous as handles on doors.
So, if our machines can think, they can also very easily turn you off, and if you've been turned off you're no longer in control.
I am fascinated by the nuances of our personalities, how our minds work and what makes us... us! Because we don't truly understand how we become who we are, we cannot predict whether a machine with the same neural capability will develop skills such as empathy, understanding, kindness and so on. Maybe, machines will possess all the intelligence but no feelings whatsoever and make cold, calculated decisions.
I'm just off to see a man about a dog at Skynet...
The thought also crosses my mind that you clearly have a situation where both human and machine-human people will exist side by side. Who has precedence? Do you have equality? What if the humans simply... switch you off! Conversely and presumably, a human-machine would be able to work 24/7 without the need for sleep. Does this mean that humans are out of a job because they have needs such as sleep?
What about reproduction? You can store as many human machines as you like. Who or what decides to reproduce a mechanical person? Would human machines have the same living requirements as us? Would they even need a body in the traditional sense of the word? Probably not - if you wanted to go somewhere you could just zip down a network.
Which makes me think again - if this is an exact copy of a human mind then they will crave the ability to touch, taste, feel, love... What would happen if the machine gets depressed? A machine can surely do more damage than an individual ever could.
Really, the list of questions is never-ending.
Finally, the issue of security. At present we can safely assume it's pretty much impossible to hack someone's brain - we're fairly secure and, amazing advances in medicine and bioengineering aside, I think we will be for quite some time to come. However, as soon as you make something digital it becomes vulnerable to attack. Can you imagine a botnet of people? Now that would be a problem...
Makes you think, doesn't it?