Service Model: a review

A robot butler outline on a blood red background
Image by OpenClipart-Vectors from Pixabay

Artificial intelligences are just tools that do nothing but follow their programming. They are not self-aware and have no capacity for self-determination. They are a what, not a who. So what is it like to be a robot just following its (complex) program, making decisions based on data alone? What is it like to be an artificial intelligence? What is the real difference between being self-aware and not? What is the difference from being human? These are the themes explored by the dystopian (or is it utopian?) and funny science fiction novel “Service Model” by Adrian Tchaikovsky.

In a future where the tools of computer science and robotics have been used to make human lives as comfortable as conceivably possible, Charles(TM) is a valet robot looking after his master’s every whim. His every action is controlled by a task list turned into sophisticated, human-facing interaction. Charles is designed to be totally logical but also totally loyal. What could go wrong? Everything, it turns out, when he apparently murders his master. Why did it happen? Did he actually do it? Is there a bug in his program? Has he been infected by a virus? Was he being controlled by others as part of an uprising? Has he become self-aware and able to make his own decision to turn on his evil master? And what should he do now? Will his task list continue to guide him once he is in a totally alien context he was never designed for, one where those around him are apparently being illogical?

The novel explores important topics we all need to grapple with, in a fun but serious way. It looks at what AI tools are for, and at the difference between a tool and a person even when they do the same jobs. Is it actually good to replace the work of humans with programs just because we can? Who actually benefits and who suffers? AI is being promoted as a silver bullet that will solve our economic problems. Yet we have been replacing humans with computers for decades based on that promise, and prices still go up while inequality seems to do nothing but rise, with ever more children living in poverty. Who is actually benefiting? A small number of billionaires certainly are. Is everyone? We have many better “toys” that superficially make life easier and more comfortable: we can buy anything we want from the comfort of our sofas, self-driving cars will soon take us anywhere we want, we can get answers to any question we care to ask, and ever more routine jobs are done by machines. Many areas of work, boring or otherwise, are becoming a thing of the past, with a promise of utopia. But are we solving problems or making them with our drive to automate everything? Is it good for society as a whole, or just good for vested interests? Are we losing track of what is most important about being human? Charles will perhaps help us find out.

Thinking about the consequences of technology is an important part of any computer science education, and all CS professionals should think about the ethics of what they are involved in. Reading great science fiction such as this is one good way to explore those consequences, though, as Ursula Le Guin said, the best science fiction doesn’t predict the future: it tells us about ourselves in the present. Following in the tradition of “The Machine Stops” and “I, Robot”, “Service Model” (and the short story “Human Resources” that comes with it) does just that, if in a satirical way. It is a must-read for anyone involved in the design of AI tools, especially those promoting the idea of utopian futures.

Paul Curzon, Queen Mary University of London


Crystal ball coupons – what your data might be giving away

Big companies know far more about you than you think. You have very little privacy from their all-seeing algorithms. They may even have worked out some very, very personal things about you that even your parents don’t know…

An outraged father in Minneapolis stormed into a supermarket chain complaining that his school-aged daughter was being sent coupons for baby clothes. The shop manager apologised … but it later turned out there was no mistake in the tiny tot offers. The teenager was expecting a baby but had not told her father. Her situation was revealed not by a crystal ball but by an algorithm. The shop was using Big Data processing algorithms that had spotted patterns in her shopping linked to “pregnant”. They had even worked out her likely delivery date. Her buying habits had triggered targeted marketing.

Algorithms linked her shopping patterns to “pregnant”

When we use a loyalty card or an online account our sales activity is recorded. This data is added to a big database, along with our details, the time, date, location and products bought (or browsed). It is then analysed. Patterns in behaviour can be tracked, and our habits, likes, dislikes and even changes in our personal situation deduced based on those patterns. Sometimes this seems quite useful, other times a bit annoying; it can surprise us, and it can be wrong.
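To get a feel for how this kind of pattern spotting might work, here is a minimal, purely illustrative Python sketch. The products, weights and threshold are invented for the example (real systems learn such associations statistically from millions of shopping records rather than hard-coding them), but the idea is the same: score a customer’s purchases against products associated with a pattern, and trigger targeted marketing if the score passes a threshold.

```python
# Toy illustration of pattern-based prediction from shopping data.
# The products, weights and threshold are invented for this example:
# a real system would learn them from huge purchase datasets.

# Hypothetical weights: how strongly each product hints at the pattern
# being looked for (here, "expecting a baby").
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.3,
    "vitamin supplements": 0.2,
    "cotton wool": 0.2,
    "baby clothes": 0.5,
}

def prediction_score(basket):
    """Sum the weights of any signal products in a shopping history."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in basket)

def flag_customer(basket, threshold=0.6):
    """Flag the customer for targeted coupons if the score is high enough."""
    return prediction_score(basket) >= threshold

# Example: a shopping history that happens to match the pattern.
history = ["unscented lotion", "cotton wool", "vitamin supplements", "bread"]
print(prediction_score(history))  # 0.7
print(flag_customer(history))     # True: coupons get sent, rightly or wrongly
```

Even this toy version shows how such predictions can misfire: buying exactly the same products as a present for someone else earns exactly the same score.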

This kind of computing is not just used to sell products: it is also used to detect fraud and to predict where the next outbreak of flu will happen. Our banking behaviour is tracked to flag suspicious transactions and help stop theft and money laundering. When we search for ‘high temperature’ our activity might be added to the data used to predict flu trends. However, the models are not always right, as there can be a lot of ‘noise’ in the data. Maybe we bought baby clothes as a present for our aunt, and were googling temperatures because we wanted to go somewhere hot for our holiday.

Whether the predictions are spot on or not is perhaps not the most important thing. Maybe we should be considering whether we want our data saved, mined and used in these ways. A predictive pregnancy algorithm seems like an invasion of privacy, even like spying, especially if we don’t know about it. Predictive analytics is big; big data is really big and big business wants our data to make big profits. Think before you click!

Jane Waite, Queen Mary University of London (now at Raspberry Pi)



This page is funded by EPSRC on research agreement EP/W033615/1.
