In one of my History of Art courses, I came across a theory known simply as ‘black box’. In science, computing and engineering, ‘black box theory’ refers to a device, a system or an object which can be understood in terms of what it does, but without any knowledge of how it does what it does. This provoked flashbacks to frantically hitting keys on most of the electronic devices I’ve ever owned, all the while exclaiming tearfully, ‘but why?!’.
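For readers who want a concrete picture, the idea translates neatly into a few lines of code. This is a minimal sketch of my own (the function name and example are invented for illustration, not taken from any course): to the caller, the function below is a black box — same input always gives the same output, but nothing about the internals is visible or needs to be understood.

```python
import hashlib

def black_box(text: str) -> str:
    """Return a fixed-length fingerprint of the input text.

    To the caller this is a 'black box': you learn what it does
    (text in, 64-character hex digest out) purely by using it,
    without knowing how the hashing inside actually works.
    """
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# We can characterise the box entirely from the outside:
print(black_box("but why?!"))        # some 64-character hex string
print(len(black_box("but why?!")))   # always 64, whatever the input
```

That outside-in way of knowing a system — behaviour without mechanism — is exactly the relationship most of us have with our phones.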
The concept of ‘black box’ made me consider the sheer amount of technology that students collectively own, and the complete lack of understanding that often goes alongside it. Why do iPhones suddenly switch themselves off and refuse to cooperate? How does encryption work? Millennials are growing up in a society in which interactive technologies are increasingly normalised as part of everyday life, so how do so many of us still not know how it all works?
Ironically, our generation is seen by older adults as bafflingly tech-savvy. Marc Prensky refers to us as ‘digital natives’: people who have grown up with the ‘digital language’ of the Internet. To millennials, platforms and online services such as Twitter and Tinder are second nature, but considering the World Wide Web was only invented in 1990, it’s not surprising that there also exists the concept of ‘digital immigrants’: those who were not born into a digital culture, but adopted it (or not) along the way.
‘Digital natives’ are undoubtedly the ones driving the digital industry. By demanding faster, simpler user experiences, they trigger the invention of ever-smarter devices. This is all well and good, but in an increasingly competitive job market, graduates-to-be are facing whole new challenges in terms of marketable skills. It’s not enough to simply own advanced technology; start-ups especially are now looking for applicants who know how to use complicated software and, more importantly, how to write it.
Coding has suddenly become a valuable skill, with coding centres popping up all over the UK. CodeClan, for example, Scotland’s first digital skills academy, opened for business on Castle Terrace just two months ago. There are also programmes available online, many of them free. Suddenly, computer science is looking a lot more appealing.
When I was deciding between laptops, I consulted (i.e. pestered) The Student’s Web Editor to avoid ending up with yet another device which would end in tears. In the process, I learnt what an Intel processor is and why Atom is a no-no. For a brief, shining moment, a gleaming career in technology opened up before my eyes. Two days later, I ended up in the laptop clinic begging them to reconnect me to Eduroam. Simple understanding of core processors does not a techie make.
However, consider the fact that technology is designed by humans to extend human capabilities. Learning to code is a way of understanding what’s behind the screen: a way to crack the ‘black box’. Mastery of software and tech is no longer a niche pursuit; it is becoming more inclusive, more accessible and much more desirable.
Illustration: Vivian Uhlir