Are &lt;meta&gt; tags a thing of the past?

Well…yes, and no! This article will explore which <meta> tags help, which hinder, and which are obsolete.

Traditionally, meta tags were placed in the head section of a webpage, hidden away from the public; they were used to help bots find out about your website’s contents. An example is given below.
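A typical head section of that era might have looked like this (the keyword and description values here are invented for illustration):

```html
<head>
  <!-- hidden from visitors, read by early search-engine bots -->
  <meta name="keywords" content="medieval, architecture, timber framing, history">
  <meta name="description" content="A site about English medieval buildings.">
</head>
```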

With advances in search engine optimisation (SEO) technology, the importance of meta tags has been superseded by the relevance of the page contents, rather than what the developer thought were the best keywords. Often, a website could receive a higher ‘click-rate’ than it deserved because the developer included popular keywords in the meta tags. If you wanted to drive traffic in ‘the good old days’ you would simply include keywords popular at the time – even if they had absolutely nothing to do with the site content! Whack “Spice Girls” in as a meta keyword and you would get a higher ranking. This eventually became a real issue, with genuine websites ranking lower than those stuffed with irrelevant keywords. The search engines soon developed better algorithms that looked for keywords within the page itself, marked up with <h1>, <h2>, <section>, <article>, <strong>, <em>, id attributes and so on. In fact, it could now be argued that the keywords meta tag hinders your ranking by restricting what people actually type to find your site. It can also have a detrimental impact on your website if your rivals can see the keywords you are using to attract customers.
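To illustrate the shift, a modern crawler infers keywords from semantic markup such as this (an invented fragment):

```html
<article id="cruck-frames">
  <h1>Cruck-framed buildings</h1>
  <section>
    <h2>Construction</h2>
    <!-- the heading tags and emphasis tell the bot what matters on this page -->
    <p>A <strong>cruck frame</strong> is built from <em>paired curved timbers</em>.</p>
  </section>
</article>
```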

<meta name="description">, however, can be a very useful tag. By using the meta description tag, you can directly control what is displayed in the listing for each page in the search engine results. Without this tag, the search engine will return what it thinks is a suitable description of your site.

You don’t need to restrict this tag to just a site description either; you could include phone and contact details so they are shown without the need to visit your page. This can be very useful for, say, a GP surgery, where someone just needs to find the phone number quickly to book an appointment. It is also the quickest way to make an impact on the searcher: an easy way to tell prospective clients (in fewer than 155 characters) what you are about before they click. If your description fits their need better than a competitor’s, you are more likely to get the click-through. So the description tag is not mandatory, but it can help you control the description of the site and potentially drive traffic.
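For the GP surgery scenario above, the tag might read something like this (the practice name and phone number are made up):

```html
<meta name="description"
      content="Hightown Surgery – family GP practice. Call 01234 567890 to book an appointment. Open Mon–Fri, 8am–6pm.">
```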

The robots tag is invaluable if you want to “hide” certain pages or content from public search engines by telling the bot what to crawl and what not to crawl. You may have an employee or student area that you do not wish to appear in a public search, for instance. In that case you would just tell the robot to ignore those pages with a <meta name="robots" content="noindex, nofollow" /> in the header of each page you want to hide.

The Open Graph (OG) meta tags enable a web site/page to “become a rich object in a social graph”. In other words, sites like Facebook will use the OG tags to better understand the site content and display images and information about it. These tags are crucial if you want to control how your site looks when shared on social media. Although this may seem regressive – having to rely on meta tags again – they are based on the RDFa protocol and, if used, have four mandatory properties. If we use IMDb as an example, the following is the list of tags required to create an Open Graph object:

  1. og:title – The title of your object as it should appear within the graph, e.g., “The Rock”.
  2. og:type – The type of your object, e.g., “video.movie”. Depending on the type you specify, other properties may also be required.
  3. og:image – An image URL which should represent your object within the graph.
  4. og:url – The canonical URL of your object that will be used as its permanent ID in the graph, e.g., “http://www.imdb.com/title/tt0117500/”.

This translates into HTML as:

<html prefix="og: http://ogp.me/ns#">
<head>
<title>The Rock (1996)</title>
<meta property="og:title" content="The Rock" />
<meta property="og:type" content="video.movie" />
<meta property="og:url" content="http://www.imdb.com/title/tt0117500/" />
<meta property="og:image" content="http://ia.media-imdb.com/images/rock.jpg" />
...
</head>
...
</html>

So then, in conclusion, can it be said that – with the exception of the description meta tag – meta tags are dead? I would reluctantly answer: “If the purpose is to aid SEO, then yes, the keywords meta tag is dead. However, if you want to control a site’s description in a search engine, or the way it displays on social media, then you still need meta tags.” OK, so the OG meta tag is not a meta tag in the traditional sense, but it still needs to be placed in the header. This is something WordPress will not allow without the use of plugins and trickery – but please, do NOT get me started on WordPress! WordPress is to web dev what “nighthawks” are to archaeologists!

So if you want great SEO, write well-formed code utilising the power of HTML5, with a sprinkling of meaningful <meta> tags that describe your content, direct bots and create Open Graph objects for social media.

#mftLearnToCode #ttrLearnToCode https://genericweb.co.uk/


Teaching Python vs HTML

So, I was talking to a colleague the other day and he was asking me about my experience of teaching Python at secondary schools – mainly, “is Python forgiving?” My simple answer was: No! Python is not forgiving at all in comparison to HTML.

I now primarily teach the web trifecta – HTML, CSS and JavaScript – and the question came about because of the forgiving nature of HTML. Students can submit some shockingly ill-formed code that will still display in a browser.

For instance –

<p>hello world</p>

will display “hello world” in a browser, all on its own. Yes, it works, but it is far from well-formed best practice, such as:

<!DOCTYPE html>
<html>
<head>
  <title>better code</title>
</head>
<body>
  <h1>Hello, World!</h1>
</body>
</html>

Python, on the other hand, is far more pedantic about its syntax rules. Often a single misplaced /, or :, or a “ instead of a ‘, will stop the code running at all. I would spend hours of teaching time locked in debugging, trying to find where a student had forgotten that colon! Don’t get me wrong, I love debugging – no, I really love debugging, and I am good at it because I enjoy it. However, debugging code because a student cannot copy from a book without dropping a colon is not so much fun.
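A quick demonstration of just how unforgiving that one missing colon is. Here the broken snippet is fed to Python’s built-in compile() so the failure can be shown without crashing anything (a teaching sketch, not production code):

```python
# A student's loop with the colon dropped after range(3).
bad_code = "for i in range(3)\n    print(i)"

try:
    # compile() parses the code without running it, so we can catch the error.
    compile(bad_code, "<student-file>", "exec")
    print("code compiled fine")
except SyntaxError as err:
    # Python refuses outright - one missing ':' and nothing runs.
    print(f"SyntaxError on line {err.lineno}")
```

Run it and Python reports a SyntaxError on line 1; the exact wording of the message varies between Python versions.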

If I am to compare the two languages in a teaching environment, teaching Python is probably easier. Why, you ask? Well, because:

  • it has to be right to work
  • it has to be indented
  • it has to be well-formed
  • it has to be structured
  • it has rules that must be followed
  • the syntax is exacting

Sure, this is harder to teach – the demand to forget “free will” and conform your code can stump the more creative students – but learn it right and you are industry-ready. Good old slap-happy HTML will often work no matter how badly the code is written or formed. That makes it harder to teach well, because students know they can get away with:

  • a lack of structure
  • no indents
  • ill-formed code
  • missing syntax
  • a real “DIY” mentality

Therefore, teaching HTML that works is very easy – teaching HTML that is industry-ready really is not!

“Well my webpage works doesn’t it sir?”

“Well…erm…yes, but it is a mess, the code is all over the place and it is very hard to follow”

“Whatever, sir…it works and that’s good enough for me, so the heck with it!”

OK, so back to the anecdote – Python.

After getting so frustrated with putting colons and the like into students’ code, I started playing a game. We had loads of old keyboards at school, so I took a few and removed the . , ? ; : “ ‘ keys from them. If a student asked me why their code wasn’t working, I would look. If it was logic, I would explain and work with them on it. If it was syntax, I would go to my desk, pick up the corresponding key and place it on their desk. Obviously, if the key they were given was a colon, they knew they had to look for the missing colon in their code. Eventually this encouraged them to look for syntax errors themselves, rather than suffer the embarrassment of a broken keyboard key being placed on their desk.

Gosh…I miss teaching Python. However, I love teaching the web trifecta here at The Training Room.

No matter what language you learn – learn to code!

#ttrLearnToCode #ttrIT


The Evolution of the Computer Science GCSE

During the 1980s, computer studies and computers were in their infancy[1]. The BBC Microcomputer was the only real choice for schools at the time. This early PC had very little in the way of end-user applications and relied on its BASIC interpreter, which meant the user needed to learn to program and build their own applications[2]. This resulted in schools focussing on teaching how to program a computer alongside how the computer works. As computers became more popular and more applications became available, the focus of teaching switched to how to use a computer and its applications, and ICT was born during the 1990s[3].

During the 2000s, it was becoming clear that ICT was no longer fit for purpose and that students were leaving school with skills in digital literacy, but not in computing. ICT began to receive negative reports from industry, educators and students as it was seen as a boring and repetitive subject that only taught how to use Microsoft Office[4].

It was not until 2010 that The Royal Society, based on information from the Computing At School group (CAS), Ofsted, Microsoft and Google (among others), set up an Advisory Group chaired by Professor Steve Furber FRS[5]. The report’s first recommendation was to stop using the acronym ICT because of its “negative connotations”, as quoted below.

  • “Recommendation 1 The term ICT as a brand should be reviewed and the possibility considered of disaggregating this into clearly defined areas such as digital literacy, Information Technology, and Computer Science. There is an analogy here with how English is structured at school, with reading and writing (basic literacy), English Language (how the language works) and English Literature (how it is used). The term ‘ICT’ should no longer be used as it has attracted too many negative connotations”[6].

Aside from the name ICT, it was becoming clear that the “current delivery of Computing education in many UK schools is [sic] highly unsatisfactory” and needed addressing[7]. Indeed, even the UK Education Secretary at the time, Michael Gove (May 2010 to July 2014), was quoted as saying the ICT curriculum was “demotivating and dull”[8]. This was brought into the headlines by the executive chairman of Google, Eric Schmidt, when he addressed the Edinburgh TV Festival in 2011, saying:

  • “I was flabbergasted to learn that today computer science isn’t even taught as standard in UK schools. Your IT curriculum focuses on teaching how to use software but gives no insight into how it’s made. That is just throwing away your great computing heritage”[9].

As a result of growing pressure from industry, Michael Gove announced that the UK Government would replace ICT with a new Computer Science curriculum from September 2012 (the start of the UK’s academic year). In that speech, Gove posited:

  • “Imagine the dramatic change which could be possible in just a few years, once we remove the roadblock of the existing ICT curriculum. Instead of children bored out of their minds being taught how to use Word or Excel by bored teachers, we could have 11-year-olds able to write simple 2D computer animations”[10].

Visit my Interactive CV

Bibliography

[1] Doyle, GCSE Computer Studies for You.

[2] Brown et al., ‘Restart’.

[3] Ibid.

[4] Coquin, ‘IT & Telecoms Insights 2008: Assessment of Current Provision’.

[5] Furber and et al, ‘Shut down or Restart?’, 12.

[6] Ibid., 18.

[7] Ibid., 5.

[8] Burns, ‘School ICT to Be Replaced by Computer Science Programme’.

[9] Schmidt, ‘Edinburgh TV Festival’.

[10] Burns, ‘School ICT to Be Replaced by Computer Science Programme’.


No more ICT… Please!

It was not until 2010 that The Royal Society, based on information from Ofsted and Microsoft (among others), set up an Advisory Group chaired by Professor Steve Furber FRS. The report’s first recommendation was to stop using the acronym ICT, yet here we are – 7 years later – still using IT! Why? #ttrLearnToCode


Crossing the Digital Platforms in WebDev

HTML5, CSS3 and JavaScript undoubtedly make creating cross-platform websites easier, but it is still essential that web developers/designers thoroughly test their sites before making them “live”.

Most websites will have been designed and built on a PC, Mac or laptop, but increasingly they are viewed on tablets and mobiles. This is often overlooked. As a teacher of computer science, I would often ask students to log in to a site, or interact with one, to do some homework. Increasingly, students would reply: “we couldn’t get the website to work on my mum’s tablet or my phone, sir”. “Don’t you have a laptop or computer at home?” “No sir, we don’t need one”.

Granted, it tends to be the youth that prefers mobile devices, but in 2016 the number of people accessing the web via mobiles and tablets surpassed those using laptops and desktops (hallaminternet). It is also fair to assume that not everyone wants to download and install yet another mobile app that they will barely use. Therefore, it is crucial that, as web developers/designers, we create responsive sites that transfer the content to any platform without the user needing to resize it. Even if CSS3 is not your strength, there are plenty of responsive templates on the net. Some – TEMPLATED, for example – are provided free of charge and royalty-free (providing you reference them). Even as a “seasoned pro”, basing code on pre-existing templates saves time and can be used to show the client quickly what to expect before you commit to the lengthy process of bespoke coding. I am certainly not suggesting we plagiarise, but if we use templates and comment our code professionally, I see no harm. As a pro, you should be able to delve into the code and personalise and change the content anyway; it is just a useful starting point.

Anyway, I digress a little. Back to the main thread. I am hoping that – using HTML5, CSS3 and JS – we can move away from mobile-specific content and towards a responsive page that displays the same content optimised for the device it is viewed on. This will help greatly with “branding” and provide a consistent experience for the user. It also means the developer designs each page once and applies a stylesheet to it, rather than making several iterations of a page for different devices.
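As a minimal sketch of that idea (class name and breakpoint chosen arbitrarily), one page plus one stylesheet can serve every device:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- tell mobile browsers to use the device width, not a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .wrap { width: 70%; margin: 0 auto; }
    /* below 600px wide, let the content fill the screen */
    @media (max-width: 600px) {
      .wrap { width: 100%; }
    }
  </style>
</head>
<body>
  <div class="wrap">Same content, optimised for any device.</div>
</body>
</html>
```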

——————————————————————————————————-

I hope this article has gone some way in helping you understand the importance of RESPONSIVE DESIGN. If it has…please LIKE, SHARE or FEEDBACK the post. Thank you.

About the Author – Dr Richard Haddlesey is the founder and webmaster of English Medieval Architecture, through which he gained his PhD in 2010, and holds Qualified Teacher Status in ICT and Computer Science. Richard is a professional web developer and digital archaeologist and holds several degrees in these fields. He is passionate about the dissemination of research and the advancement of digital education and Continued Professional Development #CPD. Driven by a desire to better prepare students for industry, Richard left mainstream teaching to focus on a career tutoring IT professionals at The Training Room, with real industry-ready skills that matter.

#ttrIT #ttrcareerinIT #ttrLearnToCode


Cloud Chaos?

As a student, I was always told – no, forced – to create folders on my computer to store all the documents pertaining to one course or project. All files, no matter the extension or type, went in a folder so they could all be found and all the links to files would work. The same was true for your desktop: keep it clean, keep it in folders. I am a huge fan of arranging all my work into folders and logical places on my physical drives. That often means I have to use USB sticks to transfer work over to another device, but at least I can. So, for years now, I have been meticulously placing the right file in the right folder, to enable myself to locate said file with ease and to ensure that if I copy a folder – via memory stick – to another device, all the “stuff” is there. Makes sense, I hope?

Now, we have cloud storage, cloud backups, cloud software, cloud this and cloud that. Most major companies offer cloud storage or backup:

  • Microsoft OneDrive
  • Google Drive
  • Adobe Creative Cloud
  • GitHub
  • Kindle Cloud Reader
  • Dropbox
  • Sony Memories
  • Canon/Nikon for photos
  • Samsung
  • iCloud

The list goes on. Most of the services above create a folder on your physical drive that is “synced” to the “cloud”. So, the work you do in MS Word will be in a OneDrive folder, while the image you created will be in your Adobe Creative Cloud folder. More on this later, but first I think we need to look at what the cloud actually is.

The “cloud” is a physical location! It is not a cloud of data that just floats in space waiting for you to grab at it. The cloud will not rain data when it gets near a hilltop or the North of England. It is located on servers at various server farms around the globe (depending on the size of the company that stores your cloud data). Those servers are constantly connected to the internet so that you can access your data anywhere, at any time. The data is replicated across several different farms, often floating back and forth (a cloud of data) to ensure it is always backed up and accessible should a server fail. So then, the cloud is rather fantastic! You can work on any device, at any location, on any platform, using various software, at any time – within reason and with some caveats. On top of this, you can rest assured you have a copy of a file backed up somewhere. Even if you delete a file, there is a good chance of recovering it, or at least finding an older version of it.

So, all this is great! So why the title “Cloud Chaos”? Well, if we think back to the folder scenario I have already alluded to, we now need a folder for each cloud provider rather than for each project. My simple logic brain now struggles to think: which cloud provider has which file, in which subfolder, for which project? For instance, I could be planning and designing a web page. I may write all the text in MS Word first, then save the work to my OneDrive so I can access the document on my phone too. Next, I create an image in Photoshop and save it to my Adobe Creative Cloud folder. I may start my HTML code in Dreamweaver and save it to the Adobe Creative Cloud as well, but my colleague needs to work on the code too, so I place it on GitHub so we can share the code in real time without my having to give them access to my Adobe account. So now I have different files in different cloud folders. It is amazing that colleagues can share work and be assured they are working on the latest iteration; however, my poor dyslexic mind cannot easily cope with files in various locations rather than in one location that I control. So, on one hand, our workflow has the potential to grow, and the potential for collaboration is almost limitless (bandwidth and latency aside). This is, understandably, why the cloud is so popular – especially as most cloud services come as an add-on to your software package or, in the case of Dropbox, are essentially free unless you need larger storage through a business account. It really is quite amazing!

So, why the chaos? Well, simply because your files are now spread across various cloud providers, and it is not until you bring them all together that they are stored in one location. If we take Dreamweaver as our example, we need to place all the associated files in a root folder so the website can link to all its assets relatively. All the image files will be in the default image folder, HTML in another, CSS and JavaScript in others. So we need to take the original files out of the cloud folders and place them on our server so the links all work. For most people this probably is not an issue, but for me – with dyslexia and short-term memory loss – it is a nightmare! Did I move the file? Where is the file? Am I working with the latest version? Have I put all the files in the right folder and deposited it all on the server? I agree the cloud makes life more collaborative and intuitive for most, but it is a new way of working that does take time to adapt to. Clearly, I am slowly getting used to disparate folders and locations.
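To make the root-folder point concrete: once every asset sits under one project folder, the page can reference them all with relative links, and the whole folder can be moved or uploaded intact (the folder names here are just common defaults):

```html
<!-- mysite/index.html — everything it needs lives under mysite/ -->
<link rel="stylesheet" href="css/style.css">
<script src="js/main.js"></script>
<img src="images/logo.png" alt="Site logo">
```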

How do you manage all your files and folders? Do you have your files across several paths and folders? How do you collate and share your workflow?

——————————————————————————————————-

I hope this article has gone some way in helping you understand the importance of organising your CLOUD storage. If it has…please LIKE, SHARE or FEEDBACK the post. Thank you.


#ttrIT #ttrcareerinIT #ttrLearnToCode

Visit his Blog and Website

Read more about Dr Richard Haddlesey BSc MSc PGCE PhD


I don’t wanna be Cyber Attacked – what can I do?

A question I often get asked is: “should I accept all these annoying updates from Microsoft?”

YES! is the short answer – and really the only answer – but many don’t, or won’t.

OK, so there are caveats. If you only connect to your own network at home, never use the World Wide Web, never do anything that requires passwords and do not use internet banking, then no, you do not need to update. If you do, however, you really should!

In a recent whitepaper, DUO* suggested that over 65% of devices running Microsoft Windows are still on the outdated Windows 7, released back in 2009. Mainstream support for the operating system was withdrawn in January 2015, although Microsoft will still release security fixes until 2020. That removal of support was over two years ago, so if you are still running Windows 7 and accessing the World Wide Web, you are increasingly vulnerable to attack! Microsoft even offered a free upgrade to the more robust Windows 10, yet people still did not switch. I appreciate that people still like Windows XP and Windows 7, but they are just no longer safe or relevant in today’s online world. The software is changed and updated for a reason; it is not just to make money – after all, they gave the upgrade away for free – it is to plug the data holes and keep you safe online. Nor is this just a problem with out-of-date Windows devices; the same goes for Android devices and for Apple’s Macs. Yes, the Mac is just as vulnerable to attack**; it is just that, with about 7% of the PC market, you tend to hear less about it***.

  • “If you continue to use Windows XP now that support has ended, your computer will still work but it might become more vulnerable to security risks and viruses. Internet Explorer 8 is also no longer supported, so if your Windows XP PC is connected to the Internet and you use Internet Explorer 8 to surf the web, you might be exposing your PC to additional threats. Also, as more software and hardware manufacturers continue to optimize for more recent versions of Windows, you can expect to encounter more apps and devices that do not work with Windows XP.” — Microsoft

Now, if you take your “old” device into work and connect to the network, you are making your entire company vulnerable to attack! Once you open a port from your device to the work intranet or Wi-Fi, you are giving attackers – via your outdated software – easy access to an otherwise secure business network. At the very least, everything you can access, an attacker can also access; if they are sophisticated, they can potentially reach the whole network. All this just because you really like older versions of Windows! At my former place of work (a secondary school), a teacher brought in their old XP laptop and opened an email they had received from a person they did not know. Unwittingly, by opening that email on the school network, they introduced ransomware onto the network, which encrypted the entire school network and all drives. For nearly a week the network was unusable while the technicians worked to restore previous backups. When the system was eventually restored, all the files people had worked on since the last backup were lost. Obviously, the school did not pay any ransom, but only because they backed up the system files twice a week; had they not done so, there would have been no way to restore the files without paying the ransom for the unlock code.

In the light of the recent cyber attacks of May 2017, Microsoft came out and said this is a “wake-up call”, reiterating the need to install their security patches as, and when, they are released.

  • “Ransomware is a type of malware that prevents or limits users from accessing their system, either by locking the system’s screen or by locking the users’ files unless a ransom is paid. More modern ransomware families, collectively categorized as crypto-ransomware, encrypt certain file types on infected systems and forces users to pay the ransom through certain online payment methods to get a decrypt key.” — https://www.trendmicro.co.uk/vinfo/uk/security/definition/ransomware

I am certainly not trying to imply that, had the user been running an up-to-date version of Windows 10, this would never have happened. Instead, I am trying to add to the discussion that an often overlooked threat to network security is internal human error****. However, “User Behavioural Analytics” is beyond the scope of this discussion.

Summary

Keeping your system up to date with the latest security patches and software add-ons remains a highly important step in combating hackers.

In short —

INSTALL and UPDATE

  • Your Operating System
  • Your browser
  • Your browser add-ons
  • Anti-Virus software
  • Anti-Malware software
  • Anti-Spyware software
  • Firewall

  • Do NOT open unknown emails and attachments EVER!

Some people tend to think that if their device is set to automatically download and install updates, alongside a disk defragmentation, at the default time of 03:00, that is enough to keep them safe – even though they turn the machine off before bed. Well… are you saying you expect the device to wake up at 03:00, turn itself on, connect – by itself – to the internet, download and install updates/patches/drivers/code, then check your hard drive for errors, before turning itself off again and going back to sleep? I’m sorry, but it doesn’t!


 

I hope this article has gone some way in helping you understand the importance of UPDATES. If it has…please LIKE, SHARE or FEEDBACK the post. Thank you.


#ttrIT #ttrcareerinIT #ttrLearnToCode


Bibliography

*https://duo.com/resources/ebooks/the-2016-duo-trusted-access-report-microsoft-edition

**https://nakedsecurity.sophos.com/2016/09/02/patch-now-recent-ios-vulnerability-affects-macs-too/

***http://www.macworld.co.uk/how-to/mac-software/do-macs-get-viruses-do-macs-need-antivirus-software-3454926/

**** The Essential Guide to Behavior Analytics – www.balabit.com


A future skills gap in the wake of the new Computer Science GCSE?

As a former Secondary School Teacher, I was part of the government’s move away from traditional Information Communications Technology (ICT) toward Computer Science as a GCSE.

The change has been profound and caught many teachers off-guard. Many older teachers of ICT could not easily make the transition to teaching computer science. Why? Well because it is now a science! A science based on computational thinking and the logical creation and analysis of algorithms and coded solutions. In simplistic terms… it’s out with Microsoft Office and in with Python IDE!

Computer Science, then, is a completely different course to ICT. Obviously there is some crossover, but for the most part it is a much more relevant science/industry-based qualification than the more business-based ICT course. Much of what was ICT now survives only as a small part of the e-commerce side of Comp Sci. The subject has moved from learning how to use software – such as MS Office – to create documents and websites, to learning how to build apps, programs and e-portfolios alongside maintaining computer systems, networks and cyber-security. As such, breaking a problem down and producing a sequenced plan, or algorithm, is now fundamental to the “art” of computational thinking.
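That planning-first approach is easy to sketch: write the sequenced plan, then turn each step into code. A minimal, invented example of the kind of task set at GCSE:

```python
# Plan: 1) start with the first mark as the best so far,
#       2) compare every remaining mark against it,
#       3) keep whichever is bigger, 4) report the result.
def highest_mark(marks):
    best = marks[0]
    for mark in marks[1:]:
        if mark > best:
            best = mark
    return best

print(highest_mark([54, 72, 68, 91, 47]))  # prints 91
```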

 

My experience of teaching both ICT and Computer Science has shown me that not all students are capable of computational thinking and understanding algorithms. Not all can think sequentially and logically; many can only process freeform, nonlinear thoughts and can make little sense of a machine that will only do what it is told, in a specific order, using a specific structured language or code.

This forces the teacher to focus more on teaching the students how to create algorithms and flowcharts – and, of course, on coding. There are many high-quality educational aids for learning to code:

  • https://code.org/learn
  • https://scratch.mit.edu/starter_projects/
  • http://www.alice.org/index.php
  • https://www.codecademy.com/learn
  • https://www.kodugamelab.com/
  • https://codecombat.com/

Students, in my experience, find it difficult to code effectively because of the strict syntax. Although Python is considered one of the friendlier languages, it is exacting in its syntax – in other words, if it expects a colon or comma, then it MUST have a colon or a comma! But why? “Well, it just does” can placate some students, but frustrate others. Getting the students to code effectively takes up a lot of teaching time at the expense of much of the theory. Most of the time we had to rely on students doing the theory for homework, which was inevitably hit and miss, with many students not bothering. The ability to create a working solution to a problem almost always forms the basis of at least one of their final Controlled Assessments, in which the student must plan, code and test a solution efficiently with no guided help from their teacher or peers. Because this is crucial to a good final grade, it is obvious that teaching and learning how to code and troubleshoot code is a classroom priority.

So, you may ask, why am I writing this blog? Because I believe there will continue to be a skills gap when our present and future cohorts of GCSE Computer Science students leave school. I am convinced they will certainly be better equipped than their ICT-qualified peers; however, with so much time given over to learning Python, I think they will still lack solid industry skills. Don’t get me wrong; learning Python, computational thinking and algorithms is a massive step in the right direction. However, students often lack the ability to translate their Python into other C-based languages, or into HTML, SQL, JavaScript and so on. And no matter how hard we drilled them on the importance of planning and writing algorithms that were not retro-engineered, they always wanted to code first and then make up a plan to fit the program.

Anyway, I digress… I am not trying to push a solution – after all, there is no single solution – I am just recording my observations to try to start a discussion on the future of the industry, and on whether others have noticed a skills gap in GCSE students.

I hope this article has gone some way in helping start a discussion on possible future skills gaps. If it has…please LIKE, SHARE or FEEDBACK the post. Thank you.

About the Author – Dr Richard Haddlesey is the founder and webmaster of English Medieval Architecture, through which he gained his PhD in 2010, and holds Qualified Teacher Status in ICT and Computer Science. Richard is a professional web developer and digital archaeologist and holds several degrees in these fields. He is passionate about the dissemination of research and the advancement of digital education and Continued Professional Development #CPD. Driven by a desire to better prepare students for industry, Richard left mainstream teaching to focus on tutoring IT professionals with real skills that matter – catering more to the individual learner’s needs and career pathway than the National Curriculum taught in schools presently can.

#ttrIT #ttrcareerinIT #ttrLearnToCode

