Eliminating harmful digital technologies in universities: a guide

By dene.mullen, 20 December, 2022
Modern institutions are rife with tech that disenfranchises, dehumanises, excludes and even bullies students and teachers. It’s high time for a rethink, says Andy Farnell

I was recently asked: “Which digital technologies could we get rid of in higher education?”

Some immediately spring to mind, such as the scourge of CCTV cameras and badge-access systems, which are turning places of learning into high-security gulags. Or the draconian monitoring of student attendance, carried out at the behest of government bureaucrats, as if students were preschool infants. But these technologies, unwelcome and unnecessary as they are, do not capture the problem – which is one of equity.

Every part of an equitable university is accountable and responsive to its core stakeholders – students and teachers, without whom the entire institution makes no sense. Each must be able to teach and learn as a fully human participant: to be genuinely heard and held in mind, and to have choice, agency, autonomy and equality of opportunity.

Since every aspect of teaching and learning is touched by technology, naming specific problem technologies for elimination is akin to asking which limb we ought to amputate from a patient with a virus. So we must reframe the question. Technology can deliver cheap, fast, efficient, uniform, accountable and secure education. But systemisation carries a catastrophic cost that falls upon students and teachers. So let us ask: what types of harm are linked to technologies, so that we might design and/or select better alternatives? How do we eliminate those products and services that cannot, or will not, perform desirable functions without attendant burdens?

Harm occurs when technologies divert equity away from key stakeholders toward powerful but marginal stakeholders, namely chancellors, trustees, directors, dignitaries, landlords, governments, industries, advertisers, sponsors, technology corporations, suppliers and publishers. Harms arise because these entities have become invested in pushing technologies that favour their products and interests into the education ecosystem.

Obviously, we can’t entertain the idea of removing all technologies from education, if only to dodge the pedant’s retort that we’d better burn all books and blackboards while we’re at it. Rather than looking for technical errors, let’s recognise that technologies are fraught with political and psychological shortcomings in their models, structures and behaviours, which lead to misuse.

As a brief summary, we wish to identify and eliminate systems that:

  • disenfranchise and disempower
  • dehumanise
  • discriminate and exclude
  • extract or seek rent
  • coerce and bully
  • mislead or manipulate

Disempowering technologies

People unable to “keep up” with technology are disempowered. Those seeking to disempower need only follow Mark Zuckerberg’s call to “move fast and break things”. For example, forced Windows updates threatened to render millions of computers obsolete, until a huge backlash forced the company to back down. Touted as “security” improvements or ways to “reduce confusion”, the updates simply handed more control of the owner’s PC to Microsoft. By contrast, the latest releases of Linux happily run on much older computers without entitled seizure of the owner’s operational sovereignty.

Similarly, vendors suddenly introduce incompatibilities into newer software. Google famously discontinues services on which millions depend. Take a solemn stroll through the Google Graveyard and see if any headstones evoke a tear. University IT centres expose students to risk by choosing software from companies with poor track records on long-term stability, equal access and interoperability. Students suddenly find their education is “not supported”.

Systems that dehumanise

To dehumanise is to ignore or minimise individual identity, erode empathy and enforce uniformity. Since the 1990s, students have been “bums on seats”. Digital technology simultaneously connects people and puts distance between them, removing the proximity and rich reality of interpersonal communication that demand a higher level of respect. Dr Andrew Kimbrell terms this deflation of responsibility “cold evil”. Typical dehumanising devices include the “issue ticketing” of customer-support systems and no-reply emails (those infuriating messages to which you cannot respond); both silence voices and stunt discourse.

Unaccountable algorithms are increasingly used to shut people out of systems automatically when their behaviour is “deemed suspicious”. People who deploy algorithms should be held personally responsible for the harms they cause, as if they had acted themselves – rather as dog owners are for their dogs. Instead, as Cathy O’Neil points out in her book The Shame Machine: Who Profits in the New Age of Humiliation, blaming the victims of toxic IT systems for falling foul of invisible “policies” is the norm.
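
To make the pattern concrete, here is a minimal sketch in Python of such an opaque lockout rule. The thresholds, field names and messages are invented for illustration; no real system is being quoted.

```python
# A toy "suspicious behaviour" lockout. The policy below is hypothetical
# and deliberately crude - which is exactly the problem being criticised.
from dataclasses import dataclass

@dataclass
class LoginEvent:
    user: str
    country: str  # country code of the login
    hour: int     # 0-23, server-local time

def is_suspicious(event: LoginEvent) -> bool:
    # An invisible "policy" that nobody told the user about.
    return event.country != "GB" or event.hour < 6

def handle_login(event: LoginEvent) -> str:
    if is_suspicious(event):
        # Locked out automatically, with no explanation and no appeal.
        return "ACCOUNT LOCKED: contact support"
    return "OK"

# A student checking email from a holiday abroad is shut out,
# though nothing malicious has happened.
print(handle_login(LoginEvent("j.doe", "FR", 23)))
```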

Systems of exclusion

The cashless canteen is as effective at starving students of food as overzealously locked-down wi-fi and audiovisual equipment is at preventing lecturers from teaching. Exclusion begins with assumptions that are silently transformed into mandates. As a regular visiting professor, I make sure to pack a flask of coffee and a lunchbox alongside my 4G wi-fi dongle and mini AV projector. Eduroam, specifically designed to sidestep such parochialism, is often disabled. Universities are hostile places unless you’re part of the “in crowd”, and that needs to change.

Furthermore, as big-tech monopolies take over education, access to essential services without “signing in” through Facebook, LinkedIn, Microsoft or Google accounts is getting harder. Those who subscribe to none of these are locked out, without alternative provision or apology. Blunt web censorship based on common keywords alienates research students investigating inconvenient subjects such as terrorism, rape, hacking or even birth control. We must re-examine the power to shape academic life that has been accidentally handed to non-academic units such as ICT, security and compliance teams. Surely the censoring and monitoring technologies characteristic of police states have no place in institutions of free enquiry and exploration by intelligent adults.
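
As a sketch of just how blunt such filtering is, consider a naive keyword blocklist in Python (the blocklist and queries are illustrative, not drawn from any real filter). Legitimate research queries trip it just as readily as anything it is meant to stop.

```python
# A toy keyword censor of the kind the article criticises.
BLOCKLIST = {"terrorism", "rape", "hacking", "birth control"}

def is_blocked(query: str) -> bool:
    # Substring matching with no notion of context or intent.
    q = query.lower()
    return any(term in q for term in BLOCKLIST)

# Every one of these legitimate research queries is blocked:
for query in [
    "counter-terrorism policy since 2001",     # politics dissertation
    "rape law reform in England and Wales",    # law review
    "ethical hacking coursework",              # cybersecurity module
    "access to birth control in rural areas",  # public-health study
]:
    print(query, "->", "BLOCKED" if is_blocked(query) else "allowed")
```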

Systems of extraction

Rent-seeking software, such as survey tools that hold research data hostage until the student pays a “premium fee”, is encouraged in universities that lack the skills to set up their own basic HTML forms. Data harvesting is performed by tools such as Turnitin, which requires students to sign over rights to their work, and by single sign-on frameworks that leak browsing habits. Tracking, attention monitoring and targeted advertising are part of campus life.
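
For perspective, a self-hosted survey form really can be this small. The Python sketch below uses only the standard library; the form fields and the responses.csv file are assumptions for illustration, and a production deployment would of course need authentication, escaping and HTTPS.

```python
# A minimal self-hosted survey: one form, responses kept on campus.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FORM = b"""<form method="post">
  <label>How useful was the lecture? <input name="rating"></label>
  <button>Submit</button>
</form>"""

class SurveyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the form itself.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FORM)

    def do_POST(self):
        # Record the submitted answers locally; nothing leaves the server.
        length = int(self.headers.get("Content-Length", 0))
        answers = parse_qs(self.rfile.read(length).decode())
        with open("responses.csv", "a") as f:
            f.write(",".join(v[0] for v in answers.values()) + "\n")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Thank you.")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), SurveyHandler).serve_forever()
```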

Let us now follow France’s lead in reining in big tech – although simply banning extractive technologies from our places of education may be too harsh a step change. Alternatives require skills and education. Instead, let’s at least mandate choice, so that those who choose, and are able, to extricate themselves are free to do so. Universities that force Google or Microsoft products on staff and students should lose government backing, for being nothing more than extensions of the US corporate estate.

Systems of coercion

Threats hardly seem appropriate to a progressive learning environment, yet for decades I have taught inconsolably anxious students mortified by attendance reports, submission systems and other machinery that sends nagging notifications, not to mention spurious or outright false warnings. The more we automate the student experience, the more brutal it becomes. Universities living in fear of losing their UKVI Tier 4 sponsor licences must dial back their overcautious machinery. The toll on the mental health of students who genuinely believe that a faulty algorithm may put them on a plane to Rwanda is not an acceptable price for over-compliance.
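
A sketch of how such machinery manufactures spurious warnings: in the toy Python below (the data and threshold are invented), a missing swipe record is silently treated as an absence, and a visa-threatening notice goes out with no human check.

```python
# Toy attendance monitor: conflates "no record" with "absent".
attendance = {
    "week1": True,
    "week2": None,   # card reader offline - no record, not an absence
    "week3": True,
    "week4": None,   # student attended but forgot their badge
}

# Both None and False count as absences under this rule.
absences = sum(1 for present in attendance.values() if not present)

if absences >= 2:
    # The student did nothing wrong, yet receives an automated threat.
    print("WARNING: attendance below threshold; your Tier 4 "
          "sponsorship may be withdrawn.")
```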

Common aims

Many inappropriate technologies blight higher education because we do not understand the technology we adopt. Solutionism, a knee-jerk mentality and a penchant for cheap, quick, off-the-shelf fixes are rife. We lack a coherent, joined-up understanding of the trade-offs: psychological, political and pedagogical.

Change begins with raising the skill levels and issue awareness of strategic, policymaking and ICT staff, and with improving the digital literacy of all academic staff, so that we can shrug off our unhealthy fallback on convenient but inappropriate technologies. It is time to make the voices of the most important stakeholders – students and faculty – heard again, and to remedy the profound dearth of equity in technology selection and procurement.

Andy Farnell is a British computer scientist specialising in signals and systems. He is a prolific speaker, visiting professor, consultant, ethical hacker, author and lifelong advocate for digital rights. He is teaching cybersecurity while writing a new book, Ethics for Hackers.



Comments (4)

Tom Worthington, 1 year 10 months ago
Eliminating harmful digital technologies from universities is very simple: listen to your digital, and educational, technologists. I read with amusement, and annoyance, reports of university staff, policymakers and government officials having to react to new technology that arrives seemingly without warning. But this technology takes decades to develop, often in the computer research labs of universities. If university administrators and academics were to ask their own on-campus experts (or come to their seminars), they could learn what will be “suddenly” arriving in the next few years. As an example, some at universities spent years refining how to use online learning. Some conducted drills to practise what to do in an emergency, such as a pandemic, that closed campuses. Rather than learn from this, many at universities tried to invent e-learning as if it were something new. This made things harder, both for staff and students. https://blog.highereducationwhisperer.com/2020/04/responding-to-coronavirus-emergency.html

nevard, 1 year 10 months ago
This article hits the nail on the head. Universities (and all HE organisations for that matter) should be facilitating and reinventing technology that works for the people. It is for that reason that I am shocked (at least in my experience as a postgraduate cyber security student) that my university has always been at war with its IT team, introducing technology that makes lives more difficult, and burying their heads in the sand when it comes to feedback and alternative options from academics in the field. Universities should look internally at their own people more often, especially when it comes to policy creation. Only then will we move forward and eliminate disempowering technologies that dehumanise, exclude, extract, and coerce students and staff. This isn’t just an annoyance; harmful tech is prohibiting research and growth.

Hugh Fletcher, 1 year 9 months ago
An excellent example of all the above is the introduction of the PeopleSoft/Oracle student information system (SIS). It was chosen by senior management. Its effect can be judged from an article, as I recall, in The Glasgow Herald: “Registration Chaos for Thousands [headline]”, as “A NEW £14 million ‘self-enrolment’ service launched this year at one of Scotland’s oldest universities has left this year’s admissions system in chaos, according to staff and students”.

At QUB, three candidates were chosen to give presentations. Briefings were held when academics were likely to be teaching. I got to one, for a product purpose-designed for the UK. It sounded good. The preferred American product, also chosen at QUB and Oxford, seemed to have started life as an HR tool for monitoring individual employees. At QUB we were told it would ALWAYS give correct information for individual students. In practice the information was sometimes incomplete, and in the beginning it was often entirely lacking. Marks for previous students were transferred from the old system, gaining errors in the process. Apparently the package had no native search function and consisted basically of saved index cards. Courses and classes were foreign to its architecture. It had been adapted for the American university system, where marks were letters, and was now to be adapted to the UK’s numerical marks. The UK system was criticised for only calculating to two decimal places, but the management decreed that all marks should be rounded up to whole numbers to increase marks, so that was not a valid complaint. The only successful implementation of the registration package was in Chemistry, where the students wrote their registration requirements down and secretaries entered the details.

By chance I shared a bus ride with a programmer who had worked on the SIS. His judgement was that there were so many layers of patches on patches that no one at Oracle understood how it worked, and that it would have been better, quicker and cheaper to have started again from scratch, but middle and/or higher management would not listen. In the same vein, the QUB team running the old IT system proposed that they could write a purpose-built SIS to specific requirements for around 1/50th of Glasgow’s £14 million. The art of selling software is to convince customers of what your programme could do, implying that it can do it already; to hide the add-on cost of the required modifications; and to sell to directors whom you can flatter, avoiding critical in-house IT experts.

Niall Doch, 1 year 3 months ago
Although there is plenty to complain about in terms of poorly implemented IT systems, I think you lost me a bit with your comment about CCTV and card-entry systems. At my institution there are numerous examples of equipment and personal possessions being stolen from rooms on campus. I know a colleague who had things stolen from their office when they stepped out for literally two minutes to go to the kitchen.