Forty years ago I “fixed” my first computer. My teachers gave me the responsibility of “helping” connect two computers that wouldn’t talk. Pushed into installing classroom computers under an ambitious digital literacy policy, they were out of their depth. Unable to understand why something seemed “broken”, they naturally turned to the support of the nearest 12-year-old. Flattered, I got to be the smart kid who solved problems that confounded adults. And I am grateful; it set me on the path to becoming a computer scientist.
The computers – a TRS-80 made by the US Tandy Corporation and a BBC Model B manufactured by Acorn in the UK – ran a common programming language that made them, in theory, “interoperable”. Although both computers spoke BASIC (Beginners’ All-purpose Symbolic Instruction Code) and had standard 5.25-inch floppy disk drives, my challenge in 1982 was that neither could read the other’s data format.
Back then we called the problem “incompatibility”. But there was no technical reason for this – indeed, within a few years, that specific problem vanished as computers settled on a standard disk format.
Such interoperable standards are how we overcome incompatibility, so that devices manufactured by different companies can work together. Indeed, in many ways, modern civilisation rests on standards. The sizes of nuts and bolts, domestic electrical voltages and the diameters of discs such as CDs and LPs are all standardised. Standardisation spurs opportunity and innovation and reduces waste. Similarly, standard programming languages let us teach one way of creating applications that run on different computers.
But, today, computing is still fundamentally broken for the same reasons it was in 1982. Users of WhatsApp cannot talk to people on Telegram or Signal. Google Workspace users cannot transfer their work to Microsoft 365.
Why, with no good technical reason, does computing remain broken at the data-exchange level 40 years on? Making matters worse, support is either unavailable or actively discouraged today – even if you can find a tenacious kid with a soldering iron and programming skills.
Incompatibility is a deliberate, and indeed legally sanctioned, ploy by big companies to divide the world into markets. In an age of extraordinary non-interoperability we have wrongly come to accept digital silos as a natural feature of technology.
Yet in education we value communicative plurality. That makes us especially sensitive to broken systems. We are increasingly split into camps: “They use Teams; we use Zoom. Can’t talk.” Or: “We use Google; they use Microsoft. Can’t collaborate.”
Since 2010, the mainstream tech world has been fragmenting. The term “splinternet” describes a departure from the standards on which internet services such as the web and email were built. Under new feudal systems we are forced to pick sides and swear fealty to the fiefdoms of Big Tech. Recently, a headmaster proudly told me how his school is a “Google Academy”.
This means lower-quality research and teaching. It obstructs research when we can’t exchange data formats. It creates power games between students, faculty and ICT teams over whose preference will triumph. It locks in content so it can’t be updated or moved: “Sorry, the slides are in PowerPoint 1.0 format and can’t be edited with the new system.” Most of all, it excludes students who can’t, or for good ethical reasons won’t, go along with the Big Tech takeover of their schools.
Lack of interoperability goes against the ethos of the internet, against education as shared public good, against principles of verification, peer review and reproducibility in scientific method and against the norms of free and open dissemination for teaching. At a fundamental level, higher education is incompatible with Big Tech.
Fortunately, though, universities are full of smart people who can solve these problems, right? Wrong. The digital literacy of the average tech user has plummeted since the 1980s. Even computer science professors treat their smartphones as magic boxes – and, even if they are knowledgeable, “security” and other “policies” are set up to thwart them.
Widespread de-skilling of university ICT staff is partly to blame. Instead of supporting staff and students, we outsource to so-called cloud services. These provide scant support and are often run by monopolists known for bribery and aggressive product lock-in. The phrase “not supported” now legitimises all manner of exclusion and responsibility-shrugging.
But help is at hand. Europe, through drives such as the European Interoperability Framework (EIF), has long championed user self-determination in software choice, bring-your-own-device (BYOD) policies and free, open-source solutions.
Thus far, these remain soft recommendations. So, while Germany’s state of Schleswig-Holstein switches all administration and schools over to interoperable, open-source software such as LibreOffice, Microsoft retains “preferred supplier” status in the UK.
The proposed Digital Markets Act would toughen things up, forcing tech companies with “gatekeeper status” to adopt standards and make their products open to interoperate.
The proposed act would be a boon to education at many levels. ICT departments would be pressured to support all staff and students without prejudice, or at least would be stopped from actively preventing interoperability under the guise of “policy”. It would put IT monocultures, de-skilling and overuse of cloud services back on the policy agenda, and it would drag latent issues cloaked in “not supported”-style language out into the sunlight.
Post-Brexit UK isolationism may slow advances toward interoperability and diversity, but the political wind is blowing the other way. A Declaration for the Future of the Internet (DFI) signed by nearly 60 countries in April will surely boost open standards, while a UK probe into “improper lobbying” by All-Party Parliamentary Groups (APPGs) may quash the march of Microsoft as much as Chinese influence. And as the EU rolls out its own Twitter replacement based on Mastodon, it seems the days of tech monocultures and exceptionalism are numbered.
Given a time machine, maybe I’d go back to 1982 and tell my teachers: “No, you fix it.” As responsible, professional adults, had they returned the computers as defective, maybe we wouldn’t be in this mess today. Non-interoperable systems were unacceptable in 1982, and they are doubly so in 2022, so let’s all stand up firmly against the descent of universities into brand silos by supporting the proposed act.
Andy Farnell is a British computer scientist specialising in signals and systems. He is a prolific speaker, visiting professor, consultant, ethical hacker, author and lifelong advocate for digital rights. He is teaching cybersecurity while writing a new book, Ethics for Hackers.