Computer Science for Fun
by Jo Brodie, Queen Mary University of London
In 2009 Desi Cryer, who is Black, shared a light-hearted video with a serious message. He’d bought a new computer with a face-tracking camera… which didn’t track his face at all. It did track his White colleague Wanda’s face though. In the video (below) he asked her to go in front of the camera and move from side to side, and the camera obediently tracked her face wherever she moved. When Desi moved back in front of the camera, the tracking stopped again. He wondered if the computer might be racist…
Another video (below), this time from 2017, showed a dark-skinned man failing to get a soap dispenser to give him any soap. Nothing happened when he put his hand underneath the sensor, but as soon as his lighter-skinned friend put his hand under it, out popped some soap! The only way the first man could get any soap was to put a white tissue on his hand first. He wondered if the soap dispenser might be racist…
Probably no-one set out to maliciously design a racist device, but designers need to check that their products work with a wide range of different people before putting them on the market. This saves the company embarrassment as well as producing something that more people want to buy.
Both devices use a sensor that is activated (or, in these cases, isn’t) by a signal. A soap dispenser shines a beam of light downwards; a hand placed below it reflects some of that light back to a sensor sitting next to the light source, which triggers the dispenser. Paler skin reflects more light than darker skin, so if the device was only ever tested on White people the sensor will never have been adjusted for the full range of skin tones, and it won’t respond appropriately to everyone. Similarly, cameras have historically been designed for White skin tones, meaning darker tones are not picked up as well.
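If you like to see ideas in code, here is a toy sketch of the problem. The reflectance numbers and the threshold are entirely made up for illustration: the point is simply that a trigger level calibrated only on paler-skinned testers never fires for lower readings.

```python
# Toy model of a light-based soap dispenser sensor.
# All numbers are invented; real devices use infrared light and
# analogue electronics, not Python!

def dispense(reflected_light, threshold):
    """The dispenser fires if enough light bounces back to the sensor."""
    return reflected_light >= threshold

# Hypothetical fraction of the beam reflected back by each hand.
readings = {"lighter skin": 0.6, "darker skin": 0.3, "white tissue": 0.9}

THRESHOLD = 0.5  # tuned only on testers with paler skin...

for hand, reflected in readings.items():
    result = "soap!" if dispense(reflected, THRESHOLD) else "nothing happens"
    print(f"{hand}: {result}")

# Testing with a wider range of people would show that the threshold (or
# the sensor's sensitivity) needs adjusting so that everyone gets soap.
```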
In the days of photographic film, technicians used what was called a ‘Shirley card’ (a photograph of a White woman with brown hair) to colour-correct photographs. That colour balancing meant darker skin tones didn’t come out as well; the problem was only really addressed when chocolate manufacturers and furniture companies complained that their chocolates and dark brown wooden products weren’t showing up correctly!
Further reading: The Racial Bias Built Into Photography, The New York Times, 25 April 2019.
It’s a good idea, when designing something that will be used by lots of different people, to make sure that it will work correctly with everyone. Having a diverse design team and, importantly, making sure that everyone feels empowered to contribute is a good way to start. Another is to test the design with different target audiences early in the design process so that changes can be made before it’s too late. How a company responds to feedback when they’ve made an oversight is also important. In the case of the computer company they acknowledged the problem and went to work to improve the camera’s sensitivity.
During the coronavirus pandemic many people bought a ‘pulse oximeter’, a device which clips painlessly onto a finger and measures your pulse and how much oxygen is circulating in your blood. If the oxygen reading became too low, people were advised to go to hospital. Oximeters shine red and infrared light from the top clip through the finger, and the light is absorbed differently depending on how much oxygen is present in the blood. A sensor on the lower clip measures how much light has got through, but the reading can be affected by skin colour (and coloured nail polish). People were concerned that pulse oximeters would overestimate the oxygen reading for someone with darker skin (that is, tell them they had more oxygen than they actually had) and that the devices might not detect a drop in oxygen quickly enough to warn them.
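For the curious, here is a heavily simplified sketch of the calculation inside a pulse oximeter. Comparing the red and infrared signals (the ‘ratio of ratios’) really is the basis of these devices, but the linear formula below is only a rough approximation often quoted in textbooks, and the signal values are invented.

```python
# Sketch of a pulse oximeter's calculation. Real devices use carefully
# calibrated curves built from clinical data; this linear formula is a
# rough classroom approximation only.

def estimate_spo2(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate oxygen saturation (%) from the pulsing (AC) and steady (DC)
    parts of the red and infrared light signals passing through a finger."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)  # the 'ratio of ratios'
    return 110 - 25 * r                      # rough empirical calibration

# Invented readings. If skin pigment absorbs extra light and the calibration
# doesn't account for it, the estimate drifts away from the person's true
# oxygen level, which is the bias described above.
print(f"{estimate_spo2(ac_red=0.02, dc_red=1.0, ac_ir=0.03, dc_ir=1.0):.0f}%")
```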
In response the UK Government announced in August 2022 that it would investigate this bias in a range of medical devices to ensure that future devices work effectively for everyone.
See also Is your healthcare algorithm racist? (from issue 27 of the CS4FN magazine).
by Jo Brodie and Paul Curzon
How the use of facial recognition technology caused the wrong Black man to be arrested.
The police were waiting for Robert Williams when he returned home from work in Detroit, Michigan. They arrested him for robbery in front of his wife and terrified daughters, aged two and five, and took him to a detention centre where he was kept overnight. During his interview an officer showed him two grainy CCTV photos of a suspect alongside a photo of Williams from his driving licence. All the photos showed a large Black man, but that’s where the similarity ended: it wasn’t Williams on CCTV but a completely different man. Williams held the photos up to his face and said “I hope you don’t think all Black people look alike”. The officer replied that “the computer must have got it wrong”.
Williams’ problems began several months before his arrest, when video clips and images of the robbery from the CCTV camera were run through face recognition software used by the Detroit Police Department. The system has access to the photos from everyone’s driving licence and can compare different faces until it finds a potential match, and in this case it falsely identified Robert Williams. No system is ever perfect, but studies have shown that face recognition technology is often better at correctly matching lighter-skinned faces than darker-skinned ones.
The way facial recognition works is not actually by comparing pictures but by comparing data. When a picture of a face is added to the system, lots of measurements are taken, such as how far apart the eyes are or what the shape of the nose is. This gives a signature for each face made up of all those numbers, and that signature is added to the database. When looking for a match from, say, a CCTV image, the signature of the new image is first determined; then algorithms look for the signature in the database ‘nearest’ to it. How well this works depends on the particular features chosen, amongst many other things. If the features chosen are a poor way to distinguish particular groups of people then there will be lots of bad matches. But how does it decide what is ‘nearest’ anyway, given that in essence it is just comparing groups of numbers? More algorithms are used, based on, for example, machine learning. The system might be trained on lots of faces and told which match and which don’t, allowing it to look for patterns that are good ways to predict matches. If, however, it is trained mainly on light-skinned faces, it is likely to be bad at spotting matches for faces of other ethnic backgrounds. It may effectively decide that “all Black people look alike”.
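Here is a minimal sketch of that matching idea: turn each face into a signature of numbers and pick the ‘nearest’ one in the database. The three measurements and every value below are invented for illustration; real systems use hundreds of features, learned by machine learning rather than chosen by hand.

```python
# Matching faces by comparing signatures (feature vectors), not pictures.
# The features and numbers below are entirely made up for illustration.

import math

def distance(sig_a, sig_b):
    """Euclidean distance between two signatures: smaller means more alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

# Database: name -> (eye spacing, nose width, jaw width), in imaginary units.
database = {
    "person A": (6.1, 3.2, 11.0),
    "person B": (5.4, 2.9, 10.2),
    "person C": (6.0, 3.6, 11.4),
}

cctv_signature = (5.9, 3.4, 11.2)  # signature computed from the CCTV image

best = min(database, key=lambda name: distance(database[name], cctv_signature))
print("Nearest match:", best)  # only a suggestion, never proof of identity
```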
However, face recognition is only part of the story. A potential match is only a pointer towards someone who might be a suspect; it’s certainly not a ‘case closed’ conclusion, and there is still work to be done to check and confirm. But as Williams’ lawyer, Victoria Burton-Harris, pointed out, once the computer had suggested Williams as a suspect it “framed and informed everything that officers did subsequently”. The man in the CCTV image wore a red baseball cap. It was for a team that Williams didn’t support (he’s not even a baseball fan), but no-one asked him about it. They also didn’t ask if he was in the area at the time (he wasn’t) or had an alibi (he did). Instead the investigators asked a security guard at the shop where the theft took place to look at some photos of possible suspects, and he picked Williams from the line-up of images. Unfortunately the guard hadn’t been on duty on the day of the theft and had only seen the CCTV footage.
Robert Williams spent 30 hours in custody for a crime he didn’t commit after his face was mistakenly selected from a database. He was eventually released and the case dropped but his arrest is still on record along with his ‘mugshot’, fingerprints and a DNA sample. In other words he was wrongly picked from one database and has now been unfairly added to another. The experience for his whole family has been very traumatic and sadly his children’s first encounter with the police has been a distressing rather than a helpful one.
The American Civil Liberties Union (ACLU) has filed a lawsuit against the Detroit Police Department on Williams’ behalf for his wrongful arrest. It is not known how many people have been arrested because of face recognition technology, but given how widely it is used it’s likely that others have been misidentified too. The ACLU and Williams have asked for a public apology, for his police record to be cleared and for his images to be removed from any face recognition database. They have also asked that the Detroit Police Department stop using facial recognition in their investigations. If Robert Williams had lived in New Hampshire he’d never have been arrested, as there is a law there which prevents face recognition software from being linked to driving licence databases.
In June 2020 Amazon, Microsoft and IBM denied the police any further access to their face recognition technology and IBM has also said that it will no longer work in this area because of concerns about racial profiling (targeting a person based on assumptions about their race instead of their individual actions) and violation of privacy and human rights. Campaigners are asking for a new law that protects people if this technology is used in future. But the ACLU and Robert Williams are asking for people to just stop using it – “I don’t want my daughters’ faces to be part of some government database. I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like.”
Technology is only as good as the data and the algorithms it is based on. And that isn’t the whole story: even if it is very accurate, it is only as good as the way it is used. If as a society our aim is to protect people from bad things happening, perhaps some technologies should not be used at all.
This article was originally published on the Teaching London Computing website where you can find references and further reading.
One of the aims of our Diversity in Computing posters (see below) is to help a classroom of young people see the range of computer scientists which includes people who look like them and people who don’t look like them. You can download our posters free from the link below.
We have free posters to download and some information about the different people who’ve helped make modern computing what it is today.
Or click here: Celebrating diversity in computing
by Paul Curzon, Queen Mary University of London
NASA Langley was the birthplace of the U.S. space program, where astronauts like Neil Armstrong learned to land on the Moon. Everyone knows the names of astronauts, but behind the scenes a group of African-American women were vital to the space program: Katherine Johnson, Mary Jackson and Dorothy Vaughan. Before electronic computers were invented, ‘computers’ were just people who did calculations, and that’s where these women started out, as part of a segregated team of mathematicians. Dorothy Vaughan became the first African-American woman to supervise staff there and helped make the transition from human to electronic computers by teaching herself, and then her staff, how to program in the early programming language FORTRAN.
The women switched from being the computers to programming them. These hidden women helped put the first American, John Glenn, in orbit, and over many years worked on calculations like the trajectories of spacecraft and their launch windows (the small period of time when a rocket must be launched if it is to get to its target). These complex calculations had to be correct. If they got them wrong, the mistakes could ruin a mission, putting the lives of the astronauts at risk. Get them right, as they did, and the result was a giant leap for humankind.
See the film ‘Hidden Figures’ for more of their story (trailer below).
This story was originally published on the CS4FN website and was also published in issue 23, The Women Are (Still) Here, on p21 (see ‘Related magazine’ below).
by Paul Curzon, Queen Mary University of London
Back in 1956, Clarence Ellis started his career at the very bottom of the computer industry. He was given a job, at the age of 15, as a “computer operator” … because he was the only applicant. He was also told that under no circumstances should he touch the computer! It’s lucky for all of us that he got the job, though. He went on to develop ideas that have made computers easier for everyone to use. Working at a computer was once a lonely endeavour: one person, on one computer, doing one job. Clarence Ellis changed that. He pioneered ways for people to use computers together effectively.
The company Clarence first worked for had a new computer. Just like all computers back then, it was the size of a room. He worked the graveyard shift and his duties were more those of a nightwatchman than a computer operator. It could have been a dead-end job, but it gave him lots of spare time and, more importantly, access to all the computer’s manuals … so he read them … over and over again. He didn’t need to touch the computer to learn how to use it!
His studying paid dividends. Only a few months after he started, the company had a potential disaster on its hands: they ran out of punch cards. Back then punch cards were used to store both programs and data. They used patterns of holes and non-holes as a way to store numbers in binary in a way the computer could read. Without punch cards the computer could not work!
It had to, though, because the payroll program had to run before the night was out. If it didn’t then no-one would be paid that month. Because he had studied the manuals in detail, more so than anyone else, Clarence was the only person who could work out how to reuse old punch cards. The problem was that the computer used a system called ‘parity checking’ to spot mistakes. In its simplest form, parity checking of a punch card involves adding an extra binary digit (an extra hole or non-hole) on the end of each number. This is done in a way that ensures that the number of holes is even. If there is an even number of holes already, the extra digit is left as a non-hole. If, on the other hand, there is an odd number of holes, a hole is punched as the extra digit. That extra binary digit isn’t part of the number. It’s just there so the computer can check if the number has been corrupted. If a hole was accidentally (or otherwise) turned into a non-hole, or vice versa, this would show up: there would now be an odd number of holes. Special circuitry in the computer would spot this and spit out the card, rejecting it. Clarence knew how to switch that circuitry off. That meant they could change the numbers on the cards by adding new holes without the cards being rejected.
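Here is a small sketch of the even-parity idea, with a punch card’s number shown as a list of bits (1 for a hole, 0 for a non-hole):

```python
# Even parity, as described above: an extra bit makes the count of holes
# (1s) even, so a single corrupted position can be detected.

def add_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(card):
    """A card passes the check if its count of 1s is still even."""
    return sum(card) % 2 == 0

card = add_parity([1, 0, 1, 1])  # three holes, so a fourth hole is added
print(card, parity_ok(card))     # [1, 0, 1, 1, 1] True

card[1] = 1                      # punch a new hole, corrupting the number
print(card, parity_ok(card))     # five holes is odd: the card is rejected

# Clarence's trick was to switch the rejection circuitry off, so that
# deliberately re-punched cards would be accepted.
```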
After that success he was allowed to become a real operator and was relied on to troubleshoot whenever there were problems. His career was up and running.
He later worked at Xerox PARC, a massively influential research centre. He was part of the team that invented graphical user interfaces (GUIs). With GUIs, Xerox PARC completely transformed the way we use computers. Instead of typing obscure and hard-to-remember commands, they introduced the now-standard ideas of windows, icons, dragging and dropping, using a mouse, and more. Clarence himself has been credited with inventing the idea of clicking on an icon to run a program.
As if that wasn’t enough of an impact, he went on to help make groupware a reality: software that supports people working together. His focus was on software that let people write a document together. With Simon Gibbs he developed a crucial algorithm called Operational Transformation. It allows people to edit the same document at the same time without it becoming hopelessly muddled. This is actually very challenging. You have to ensure that two (or more) people can change the text at exactly the same time, and even at the same place, without each ending up with a different version of the document.
The actual document sits on a server computer, which must make sure that its copy is always the same as the ones everyone is individually editing. When people type changes into their local copy, the master is sent messages informing it of the actions they performed. The trouble is that the order in which those messages arrive can change what happens. Clarence’s operational transformation algorithm solved this by changing the commands from each person into ones that work consistently whatever order they are applied in. It is the transformed operation that is applied to the master, and that master version is the version everyone then sees as their local copy. Ultimately everyone sees the same version. This algorithm is at the core of programs like Google Docs that have made collaborative editing of documents commonplace.
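Here is a toy sketch of the key position-shifting idea, for the simplest possible case: two people inserting text at the same moment. The real algorithm Ellis and Gibbs published handles deletions, tie-breaking and many users; this only shows why transforming one operation against another makes both arrival orders agree.

```python
# Toy operational transformation for two simultaneous text insertions.
# An operation is (position, text). If someone else's insertion landed
# earlier in the document, our position must shift right by its length.

def transform(op, other):
    pos, text = op
    other_pos, other_text = other
    if other_pos <= pos:
        pos += len(other_text)
    return (pos, text)

def apply_op(doc, op):
    pos, text = op
    return doc[:pos] + text + doc[pos:]

doc = "Hello world"
alice = (5, ",")   # Alice inserts a comma after "Hello"
bob = (11, "!")    # at the same moment, Bob adds "!" at the end

# Server receives Alice first, then applies Bob's op transformed against hers:
v1 = apply_op(apply_op(doc, alice), transform(bob, alice))
# If Bob's op had arrived first instead:
v2 = apply_op(apply_op(doc, bob), transform(alice, bob))

print(v1, "|", v2)  # both orders give "Hello, world!"
```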
Clarence Ellis started his career with a lonely job. By the end of his career he had helped ensure that writing on a computer at least no longer needs to be a lonely affair.
This article was originally published on the CS4FN website. One of the aims of our Diversity in Computing posters (see below) is to help a classroom of young people see the range of computer scientists which includes people who look like them and people who don’t look like them. You can download our posters free from the link below.
We have free posters to download and some information about the different people who’ve helped make modern computing what it is today.
Or click here: Celebrating diversity in computing
by Jo Brodie and Paul Curzon, Queen Mary University of London
Born in the US in 1989, Freddie Figgers was abandoned as a baby by his biological parents but brought up with love and kindness by two much older adoptive parents, who kindled his early enthusiasm for fixing things and inspired his work in smart health. He now runs the first Black-owned telecommunications company in the US.
When Freddie was 9 his father bought him an old (broken) computer from a charity shop to play around with. He’d previously enjoyed tinkering with his father’s collection of radios and alarm clocks, and when he opened up the computer he could see which of its components and soldering links were broken. He spotted that he could replace these with the same kinds of components from one of his dad’s old radios and, after several attempts, his computer was working again. Freddie was hooked, and he started to learn how to code.
When he was 12 he attended an after-school club and set to work fixing the school’s broken computers. His skill impressed the club’s leader, who also happened to be the local Mayor, and soon Freddie was being paid several dollars an hour to repair even more computers for the Mayor’s office (in the city of Quincy, Florida) and her staff. A few years later Quincy needed a new system to ensure that everyone’s water pressure was correct. A company offered to create software to monitor the water pressure gauges and said it would cost 600,000 dollars. Freddie, now 15 and still working with the Mayor, offered to create a low-cost program of his own and he saved the city thousands in doing so.
He was soon offered other contracts and used the money coming in to set up his own computing business. He heard about an insurance company in another US city whose offices had been badly damaged by a tornado, losing all of their customers’ records. That gave him the idea to set up a cloud computing service (which means that the data are stored in several different places, so if one is damaged the data can easily be recovered from the others).
His father, now quite elderly, had dementia and regularly wandered off and got lost. Freddie found an ingenious way to help him by rigging up one of his dad’s shoes with a GPS receiver and two-way communication connected to his computer: he could talk to his dad through the shoe! If his dad went missing, Freddie could talk to him, find out where he was, and go and get him. Freddie later sold his shoe tracker for over 2 million dollars.
Living in a rural area, he knew that mobile phone coverage and access to the internet were not as good as in larger cities. Big telecommunications companies are not keen to invest their money and equipment in areas with much smaller populations, so Freddie decided to set up his own. It took him quite a few applications to the FCC (the US’s Federal Communications Commission, which regulates internet and phone providers) but eventually, at 21, he was both the youngest and the first Black person in the US to own a telecoms company.
Most telecoms companies just provide a network service, but his company also creates affordable smartphones which have ‘multi-user profiles’ (meaning that phones can be shared by several people in a family, each with their own profile). The death of his mother’s uncle from a diabetic coma also inspired him to create a networked blood glucose (sugar) meter that can link up wirelessly to any mobile phone. This lets someone share their blood glucose measurements not only with their healthcare team but also with close family members, who can help keep them safe when their glucose levels are too high.
Freddie has created many tools to help people in different ways through his work in health and communications, and he’s helping the next generation too: he’s created a ‘Hidden Figgers’ scholarship to encourage young people in the US to take up tech careers. Perhaps we’ll see a few more fantastic folk like Freddie Figgers in the future.
This article was originally published on our sister website at Teaching London Computing (which has lots of free resources for computing teachers). It hasn’t yet been published in an issue of CS4FN but you can download all of our free magazines here.
A ‘shoe tech’ device for people who have no sense of direction – read about it in ‘Follow that Shoe’ on the last page of the wearable technology issue of CS4FN (‘Technology Worn Out (And About)’, issue 25).
Right to Repair – a European movement to make it easier for people to repair their devices, or even just change the battery in a smartphone themselves. See also the London-based Restart Project which is arguing for the same in the UK.
by Paul Curzon, Queen Mary University of London
Satellites are critical to much modern technology, and especially to GPS, which allows our smartphones, laptops and cars to work out their exact position on the surface of the Earth. This is central to all mobile technology, wearable or not, that relies on knowing where you are, from plotting a route to your nearest Indian restaurant to telling you where a person you might want to meet is. Many, many people were involved in creating GPS, but it was only in Black History Month 2017 that the critical part Gladys West played became widely known.
As a child Gladys worked with her family in the fields of their farm in rural Virginia. That wasn’t the life she wanted, so she worked hard through school, leaving as the top student. She won a scholarship to university, and then landed a job as a mathematician at a US navy base.
There she solved the maths problems behind the positioning of satellites. She worked closely with the programmers to write the code to do calculations based on her maths. Nine times out of ten the results that came back weren’t exactly right, so much of her time was spent working out what was going wrong with the programs, as it was vital that the results were very accurate.
Her work on the Seasat satellite won her a commendation. It was a revolutionary satellite designed to remotely monitor the oceans. It collected data about things like temperature, wind speed and wind direction at the sea’s surface and the heights of waves, as well as sensing data about sea ice. This kind of remote sensing has since had a massive impact on our understanding of climate change. Gladys specifically worked on the satellite’s altimeter, a radar-based sensor that allowed Seasat to measure its precise distance from the surface of the ocean below. She continued this work on later remote sensing satellites too, including Geosat, another Earth observation satellite.
Knowing the positions of satellites is the foundation for GPS. The way GPS works is that our mobile receivers pick up a timed signal from several different satellites. Calculating where we are can only be done if you first know very precisely where those satellites were when they sent the signal. That is what Gladys’ work provided.
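Here is a toy, 2D version of that calculation. The satellite positions and the receiver’s true position are invented, and real GPS works in 3D with a fourth satellite to cancel out errors in the receiver’s clock; but the core idea is the same: known satellite positions plus measured distances pin down where you are.

```python
# Toy 2D 'GPS': given exact satellite positions and measured distances,
# find where the receiver is. All positions here are invented.

import math

sats = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # known satellite positions
# Distances measured from signal travel times (true position is (3, 4)):
r1, r2, r3 = [math.hypot(3 - x, 4 - y) for (x, y) in sats]

(x1, y1), (x2, y2), (x3, y3) = sats

# Each satellite gives a circle the receiver must lie on. Subtracting the
# circle equations pairwise leaves two straight-line equations
# a*x + b*y = c, which Cramer's rule solves.
a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2

det = a1 * b2 - a2 * b1
x = (c1 * b2 - c2 * b1) / det
y = (a1 * c2 - a2 * c1) / det
print(f"Receiver is at ({x:.1f}, {y:.1f})")  # (3.0, 4.0)
```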
You can now buy, for example, GPS watches, allowing you to wear a watch that watches where you are. They can also be used by people with dementia, who have serious memory problems, allowing their carers to find them if they go out on their own and become confused about where they are. They also allow parents to know where their kids are all the time. Do you think that’s a good use?
Since so much technology now relies on knowing exactly where we are, Gladys’ work has had a massive impact on all our lives.
This article was originally published on the CS4FN website and a copy can also be found on page 14 of Issue 25 of CS4FN, “Technology worn out (and about)”, on wearable computing, which can be downloaded as a PDF, along with all our other free material, here: https://cs4fndownloads.wordpress.com/
This article is also republished during Black History Month and is part of our Diversity in Computing series, celebrating the different people working in computer science (Gladys West’s page).
by Paul Curzon, Queen Mary University of London
To be a good computer scientist you have to enjoy problem solving. That is what it’s all about: working out the best way to do things. You also have to be able to think in a logical way: be a bit of a Vulcan. But what does that mean? It just means being able to think precisely, extracting all the knowledge possible from a situation by pure reasoning. It’s about being able to say what is definitely the case given what is already known… and it’s fun to do. That’s why there is a Sudoku craze going on as I write. Sudoku are just pure logical thinking puzzles. Personally I like Kakuro better. They are similar to Sudoku, but with a crossword format.
A Kakuro is a crossword-like grid where each square has to be filled in with a digit from 1-9, not a letter. Each horizontal or vertical block of digits must add up to the number given to its left or above it, respectively, and all the digits in each such block must be different. That part is similar to Sudoku, though unlike Sudoku, numbers can be repeated on a line as long as they are in different blocks. Also, unlike Sudoku, you aren’t given any starting numbers, just a blank grid.
Where does logic come into it? Take the following fragment:
There is a horizontal block of two cells that must add up to 16. The ways that could be done using digits 1-9 are 9+7, 8+8 or 7+9. But it can’t be 8+8, as that needs two 8s in a block, which is not allowed, so we are left with just two possibilities: 9+7 or 7+9. Now look at the vertical blocks. One of them consists of two cells that add up to 17. That can only be 9+8 or 8+9. That doesn’t seem to have got us very far, as we still don’t know any numbers for sure. But now think about the top corner. We know from the across block that it is definitely 9 or 7, and from the down block that it is definitely 9 or 8. That means it must be 9, as that is the only way to satisfy both restrictions.
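If you want to check that deduction by brute force, a few lines of Python can list the possibilities for each block and intersect them at the shared corner square:

```python
# Brute-force version of the corner-square deduction described above.

from itertools import permutations

def fillings(total, length):
    """All ways to fill a block with distinct digits 1-9 summing to total."""
    return [p for p in permutations(range(1, 10), length) if sum(p) == total]

across = fillings(16, 2)  # [(7, 9), (9, 7)]; no (8, 8), digits must differ
down = fillings(17, 2)    # [(8, 9), (9, 8)]

# The corner square is the first cell of both the across and down blocks:
corner = {a[0] for a in across} & {d[0] for d in down}
print(corner)  # {9}: the only digit that satisfies both blocks
```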
Here is a full Kakuro to try. There is also a printer-friendly PDF version. Check your answer at the very end of this post when you are done.
Being able to think logically is important because computer programming is about coming up with precise solutions that even a dumb computer can follow. To do that you have to make sure all the possibilities have been covered. Reasoning very much like in a Kakuro is needed to convince yourself and others that a program does do what it is supposed to.
This article was included on Day 11 (The proof of the pudding… mathematical proof) of the CS4FN Advent Calendar in December 2021. Before that it was originally published on CS4FN and can also be found on page 16 of CS4FN Issue 3, which you can download as a PDF below. All of our free material can be downloaded here: https://cs4fndownloads.wordpress.com/
by Paul Curzon, Queen Mary University of London
A gentoo penguin slumps belly-first on a nest at Damoy, on the Antarctic Peninsula. Nearby some lichen grows across a rock, and schools of krill float through the Southern Ocean. Every one of these organisms is a part of life in the Antarctic, and scientists study each of them. But what happens to one species affects all the others too. To help make sure that they all survive, scientists have to understand how penguins, plants, krill and everything else in the Antarctic interact with one another. They need to figure out the rules of the ecosystem.
When you’re trying to understand a system that includes everything from plants to penguins, things get a bit complicated. Fortunately, ecology has a new tool to help, called complexity theory. Anje-Margriet Neutel is a Biosphere Complexity Analyst for the British Antarctic Survey. It’s her job to take a big puzzle like the Antarctic ecosystem, and work out where each plant and animal fits in. She explains that ‘complexity is sort of a new brand of science’. Lots of science is about isolating something – say, a particular chemical – from its surroundings so you can learn about it, but when you isolate all the parts of a system you miss how they work together. What complexity tries to do is build a model that can show all the important interactions in an ecosystem at the same time.
So for a system as big as a continent full of species, where do you start? Anje’s got a sensible answer: you start with what you can measure. Energy’s a good candidate. After all, every organism needs energy to stay alive, and staying alive is pretty much the first thing any plant or animal needs to do. So if you can track energy and watch it move through the ecosystem, you’ll learn a lot about how things work. You’ll find out what comes into the system, what goes out and what gets recycled.
Once you’ve got an idea of how everything fits together you’ve got what scientists call a model. The really clever thing you can do with models is start to mess around with them. As an example Anje says, ‘What would happen if you took one group of organisms and put in twice as much of them?’ If you had a system with, say, twice as many penguins, the krill would have to be worried, because more penguins are going to want to eat them. If the krill run out, what happens to the penguins? Or to the seals that like eating krill too? It gets complicated pretty quickly, and those complicated reactions are just what scientists want to predict. A toy version of this kind of experiment is sketched below.
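Everything in this sketch (the species, the equations and the numbers) is invented purely to show the kind of ‘what if’ experiment Anje describes; real ecosystem models are far richer.

```python
# Invented two-species model: krill reproduce, penguins eat krill.
# Run it twice, once with double the penguins, and compare the outcomes.

def step(krill, penguins, dt=0.1):
    growth = 0.4 * krill                     # krill reproducing
    predation = 0.02 * krill * penguins      # krill eaten by penguins
    feeding = 0.005 * krill * penguins       # eaten krill feed penguins
    deaths = 0.3 * penguins                  # penguins dying off
    krill += (growth - predation) * dt
    penguins += (feeding - deaths) * dt
    return max(krill, 0.0), max(penguins, 0.0)

for label, start in [("normal penguins", 10.0), ("doubled penguins", 20.0)]:
    krill, penguins = 100.0, start
    for _ in range(200):                     # simulate 20 'seasons'
        krill, penguins = step(krill, penguins)
    print(f"{label}: krill={krill:.0f}, penguins={penguins:.0f}")
```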
Figuring out how an ecosystem works is all about rules and structure. Ecosystems are huge complicated things, but they’re not random – whether they work or not depends on having the right organisms doing jobs in the right places, and on having the right connections between all the different parts. It’s like a computer program that way. Weirdly, it’s also a bit like language. In fact, Anje’s background is in studying linguistics, not ecology. Think of an ecosystem like a sentence – there are thousands of words in the English language but in order to make a sentence you have to put them together in the right way. If you don’t have the right grammar your sentence just won’t make sense, and if an ecosystem doesn’t have the right structure it’ll collapse. Anje says that’s what she wants to discover in the ecosystems she studies. ‘I’m interested in the grammar of it, in the grammar of nature.’
Since models can help you predict how an ecosystem reacts to strange conditions, Anje’s work could help Antarctica survive climate change. ‘The first thing is to understand how the models work, how the models behave, and then translate that back to the biology that it’s based on,’ she explains. ‘Then say OK, this means we expect there may be vulnerable areas or vulnerable climate regions where you can expect something to happen if you take the model seriously.’ If scientists like Anje can figure out how Antarctica’s ecosystems are set up to work, they’ll get clues about which areas of the continent are most at risk and what they can do to protect them.
Surviving on a continent where the temperature hardly ever gets above freezing is tough, and climate change is probably going to make it even tougher. If we can figure out how Antarctic ecosystems work, though, we’ll know what the essential elements for survival are, and we’ll have clues about how to make things better. Extracting the secret grammar of survival isn’t going to be a simple job, but that’s no surprise to the people working on it. After all, they’re not called complexity scientists for nothing.
This article was originally published on CS4FN and can also be found on pages 10-11 of CS4FN Issue 9, Programmed to Save the World, which you can download as a PDF. All of our free material can be downloaded here: https://cs4fndownloads.wordpress.com/
This blog is funded through EPSRC grant EP/W033615/1.