Can a machine learn from its mistakes, until it plays a game perfectly, just by following rules? Donald Michie worked out a way in the 1960s. He made a machine out of matchboxes and beads called MENACE that did just that. Our version plays the game Ladder and is made of cups and sweets. Punish the machine when it loses by eating its sweets!
Let’s play the game Ladder. It is played on a board like a ladder, with a single piece (an X) placed on the bottom rung. Players take it in turns to make a move, moving the piece 1, 2 or 3 places up the ladder. You win if you are the one who moves the piece to the top rung of the ladder, so reaching the target. We will play on a ladder with 10 rungs as on the right (but you can play on larger ladders).
To make the learning machine, you need 9 plastic cups and lots of wrapped sweets coloured red, green and purple. Spread out the sheets showing the possible board positions (see below) and place a cup on each. Put coloured sweets in each cup to match the arrows: for most positions there are red, green and purple arrows, so you put a red, green and purple sweet in those cups. Once all cups have sweets matching the arrows, your machine is ready to play (and learn).
The machine plays first. Each cup sits on a possible board position that the machine could end up in. Find the cup that matches the board position the game is in when it is the machine’s go. Shut your eyes and take a sweet at random from that cup, placing it next to the cup. Make the move indicated by the arrow of that colour. Then the machine’s human opponent makes a move. Once they have moved, the machine plays in the same way again, finding the cup for the new position and taking a sweet to decide its move. Keep playing alternately like this until someone wins. If the machine ends up in a position where the cup has no sweets left, then it resigns.
The 9 board positions with arrows showing possible moves. Place a cup on each board position with sweets corresponding to the arrows. Image by Paul Curzon
If the machine loses, then eat the sweet corresponding to the last move it made. It will never make that mistake again! Win or lose, put all the other sweets back.
The initial cup for board position 8, with a red and purple sweet. Image by Paul Curzon
Now, play lots of games like that, punishing the machine by eating the sweet of its last move each time it loses. The machine will play badly at first. It’s just making moves at random. The more it loses, the more sweets (losing moves) you eat, so the better it gets. Eventually, it will play perfectly. No one told it how to win – it learnt from its mistakes because you ate its sweets! Gradually the sweets that are left encode the rules of how to win.
Try slightly different rules. At the moment we only punish bad moves. You could also reward the machine when it wins, by adding an extra sweet of the same colour for every move that led to the win. The machine would then be more likely to make those moves again. What other variations of rewards and punishments could you try?
Why not write a program that learns in the same way, but using data values in arrays to represent moves instead of sweets? Not so yummy!
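Here is a minimal sketch in Python of how that might look. The function names, the random opponent and the details are my own illustration, not part of the original activity: each “cup” becomes a list of the moves still allowed from that position, and when the machine loses, the last move it made is removed from the cup it came from.

```python
import random

RUNGS = 10          # the piece starts on rung 1; rung 10 is the target
MOVES = [1, 2, 3]   # a "sweet" is just one of these move sizes

# One "cup" per board position the machine might move from (rungs 1..9).
# Each cup starts full: every legal move from that position is present.
cups = {pos: [m for m in MOVES if pos + m <= RUNGS] for pos in range(1, RUNGS)}

def play_game():
    """Play one game: the machine moves first, a random opponent replies.
    Returns the (position, move) choices the machine made and whether it won."""
    pos, history = 1, []
    while True:
        if not cups[pos]:
            return history, False          # empty cup: the machine resigns
        move = random.choice(cups[pos])    # draw a random "sweet"
        history.append((pos, move))
        pos += move
        if pos == RUNGS:
            return history, True           # machine reached the target
        # Opponent's turn (here just a random legal move).
        pos += random.choice([m for m in MOVES if pos + m <= RUNGS])
        if pos == RUNGS:
            return history, False          # opponent reached the target

def punish(history):
    """Eat the sweet for the machine's last move: remove it from its cup."""
    last_pos, last_move = history[-1]
    cups[last_pos].remove(last_move)

for _ in range(1000):
    history, machine_won = play_game()
    if not machine_won and history:
        punish(history)

print(cups)   # the sweets left encode the machine's learnt strategy
```

Run it and the cups that remain full tell you which moves are always safe, just as with the real sweets.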
– Paul Curzon, Queen Mary University of London
This page is funded by EPSRC on research agreement EP/W033615/1.
In a recent episode of Dr Who, The Well, Deaf actress Rose Ayling-Ellis plays a Deaf character, Aliss. Aliss is the survivor of some, at first unknown, disaster that has befallen a mining colony 500,000 years in the future. The Doctor and current companion Belinda arrive with troopers. Discovering Aliss is deaf, they communicate with her using a nifty futuristic gadget of the troopers that picks up everything they say, converts it into text as they speak, and projects it in front of them, allowing her to read what they say in real time.
Such a gadget is not actually so futuristic (other than a whole group of troopers carrying them). Dictation programs have existed for a long time, and now, with faster computers and modern natural language processing techniques, they can convert speech to text in real time from a variety of speakers without lots of personal training (though they do still make mistakes). Holographic displays also exist, though one as portable as the troopers’ is still a stretch. An alternative that definitely exists is augmented reality glasses specifically designed for deaf people (though they are still expensive). A deaf or hard of hearing person who owns a pair can read what is spoken through their glasses in real time as a person speaks to them, with the computing power provided by their smartphone, for example. The text could also be displayed so that it appeared to be out in the world (not on the lenses), as though it were appearing next to the person speaking. The effect would be pretty much the same as in the programme, but without the troopers having to bring gadgets of their own, just Aliss wearing glasses.
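To get a feel for how unfuturistic the speech-to-text part is, here is a minimal sketch using the open source SpeechRecognition Python package. It is just one illustrative way of doing it, not how Aliss’s gadget, or any particular pair of AR glasses, actually works.

```python
# A minimal sketch of real-time speech-to-text using the open source
# SpeechRecognition package (pip install SpeechRecognition, plus PyAudio
# for the microphone). One illustrative approach, not any real gadget.
import speech_recognition as sr

recogniser = sr.Recognizer()
with sr.Microphone() as source:
    recogniser.adjust_for_ambient_noise(source)   # calibrate for background noise
    print("Speak...")
    while True:
        audio = recogniser.listen(source, phrase_time_limit=5)  # grab a short phrase
        try:
            text = recogniser.recognize_google(audio)  # send to a recognition service
            print(text)   # AR glasses would draw this on the display instead
        except sr.UnknownValueError:
            pass          # couldn't make out the speech; keep listening
```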
Aliss (and Rose) used British Sign Language of course, and she and the Doctor were communicating directly using it, so one might have hoped that by 500,000 years in the future someone would have had the idea of projecting sign language rather than text. After all, British Sign Language is a language in its own right with a different grammatical structure to English. It is therefore likely that it would be easier for a native BSL user to see sign language rather than read text in English.
Some Deaf people might also object to glasses that translate into English because doing so undermines their first language and so their culture. However, glasses that translate into sign language can do the opposite and reinforce sign language, helping people learn the language by being immersed in it (whether deaf or not). Services like this do in fact already exist, connecting Deaf people to expert sign language interpreters who see and hear what they do and translate for them – whether through glasses or laptops.
Of course, all the above is about allowing Deaf people (like Aliss) to fit into a non-deaf world (like that of the troopers), allowing her to understand them. The same technology could also be used to allow everyone else to fit into a Deaf world. Aliss’s signing could have been turned into text for the troopers in the same way. Similarly, augmented reality glasses, connected to a computer vision system, could translate sign language into English, allowing non-deaf people wearing the glasses to understand people who are signing.
So it’s not just Deaf people who should be wearing sign language translation glasses. Perhaps one day we all will. Then we would be able to understand (and over time hopefully learn) sign language and actively support the culture of Deaf people ourselves, rather than just making them adapt to us.
This week (5-11th May) is Deaf Awareness Week, an opportunity to celebrate d/Deaf* people, communities, and culture, and to advocate for equal access to communication and services for the d/Deaf and hard of hearing. A recent step forward is that sign language has started appearing on railway stations.
*”deaf” with a lower-case “d” refers to audiological experience of deafness, or those who might have become deafened or hard of hearing in later life, so might identify closer to the hearing community. “Deaf” with an upper-case “D” refers to the cultural experience of deafness, or those who might have been born Deaf and therefore identify with the Deaf community. This is similar to how people might describe themselves as “having a disability” versus “being disabled”.
If you’re like me and travel by train a lot (long time CS4FN readers will be aware of my love of railway timetabling), you may have seen these relatively new British Sign Language (BSL) screens at various railway stations.
They work by automatically converting train departure information into BSL by stitching together pre-recorded videos of BSL signs. Pretty cool stuff!
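We don’t know the details of the real railway system, but the basic stitching idea can be sketched like this: map each phrase of the departure information to a pre-recorded clip and queue the clips up for playback. The phrases and clip filenames below are invented for illustration.

```python
# A toy sketch of the "stitching" idea (not the real railway system):
# map each known phrase of an announcement to a pre-recorded BSL clip,
# then play the clips back to back. Clip names are invented for illustration.
SIGN_CLIPS = {
    "the next train": "bsl_next_train.mp4",
    "london kings cross": "bsl_london_kings_cross.mp4",
    "platform 4": "bsl_platform_4.mp4",
    "is delayed": "bsl_delayed.mp4",
}

def announcement_to_clips(phrases):
    """Turn a list of known phrases into an ordered playlist of sign clips.
    A real system would also have to put the clips into BSL order,
    which is not the same as English word order (see below)."""
    return [SIGN_CLIPS[p] for p in phrases if p in SIGN_CLIPS]

playlist = announcement_to_clips(
    ["the next train", "london kings cross", "platform 4", "is delayed"])
print(playlist)   # a video player would now show these clips in sequence
```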
When I first saw these, though, there was one small thing that piqued my interest – if d/Deaf people can see the screen, why not just read the text? I was sure it wasn’t an oversight: Network Rail and train operators worked closely with d/Deaf charities and communities when designing the system. So, being a researcher in training, I decided to look into it.
Image by Daniel Gill
It turns out that there are several reasons.
There have been many years of research investigating reading comprehension in d/Deaf people compared to their hearing peers. In a 2015 paper, a cohort of d/Deaf children had significantly weaker reading comprehension skills than hearing children of both the same chronological age and the same reading age.
Although this gap does seem to close with age, some d/Deaf people may be far more comfortable and skilful using BSL to communicate and receive information. It should be emphasised that BSL is considered a separate language and is structured very differently to spoken and written English. As an example, take the statement:
“I’m on holiday next month.”
In BSL, you put the time first, followed by topic and then comment, so you’d end up with:
“next month – holiday – me”
As one could imagine, trying to read English (a second language for many d/Deaf people) with its wildly different sentence structure could be a challenge… especially as you’re rushing through the station looking for the correct platform for your train!
Sometimes, as computer scientists, we’re encouraged to remove redundancies and make our systems simpler and easy-to-use. But something that appears redundant to one person could be extremely useful to another – so as we go on to create tools and applications, we need to make sure that all target users are involved in the design process.
Many names stand out as pioneers of electronic music, combining computer science, electronics and music to create new and amazing sounds. Kraftwerk would top many people’s lists of the most influential bands and Jean-Michel Jarre must surely be up there. Giorgio Moroder returned to the limelight with Daft Punk, having previously invented electronic disco in producing Donna Summer’s “I feel love”. Will.i.am, La Roux or Goldfrapp might be on your playlist. One of the most influential creators of electronic music, a legend to those in the know, is barely known by comparison though: Delia Derbyshire.
Delia worked for the BBC Radiophonic Workshop, the department tasked with producing innovative music to go with the BBC’s innovative programming, and played a major part in its fame. She had originally tried to get a job at Decca Records but was told they didn’t employ women in their recording studios (a big loss for them!). In creating the sounds and soundscapes behind hundreds of TV and radio programmes, long before electronic music went mainstream, her ideas have influenced just about everyone in the field, whether they have heard of her or not.
The first person to realise that machines would one day be able not just to play music but also to compose it was Victorian programmer, and Countess, Ada Lovelace.
So have you heard her work? Her most famous piece of music you will most definitely know. She created the original electronic version of the Dr Who theme long before pop stars were playing electronic music. Each individual note was created separately, by cutting, splicing, speeding up and slowing down recordings of things like a plucked string and white noise. So why didn’t you know of her? It’s time more people did.
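If you fancy trying a digital version of her tape trick, here is a tiny sketch: resampling a recorded sound so it plays back faster raises its pitch, just as speeding up a tape does. This is a modern numpy illustration, not how Delia actually worked, of course.

```python
# A tiny digital analogue of the tape trick: play a recording back faster
# and every frequency in it rises in pitch (twice the speed is an octave
# higher). The sound here is just a stand-in decaying tone, not a real pluck.
import numpy as np

SAMPLE_RATE = 44100
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
pluck = np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)   # decaying 220 Hz tone

def speed_up(samples, factor):
    """Resample so the sound plays 'factor' times faster (and higher)."""
    indices = np.arange(0, len(samples), factor)
    return np.interp(indices, np.arange(len(samples)), samples)

octave_up = speed_up(pluck, 2.0)   # one octave higher, half as long
fifth_up = speed_up(pluck, 1.5)    # roughly a perfect fifth higher
```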
For Star Wars Day (May 4th), here is some Star Wars inspired research from the archive…
Virtual reality can give users an experience that was previously only available a long time ago in a galaxy far, far away. Josh Holtrop, a graduate of Calvin College in the USA, constructed a Jedi training environment inspired by the scene from Star Wars in which Luke Skywalker goes up against a hovering droid that shoots laser beams at him. Fortunately, you don’t have to be blindfolded in the virtual reality version, like Luke was in the movie. All you need to wear over your eyes is a pair of virtual reality goggles with screens inside.
When you’re wearing the goggles, it’s as though you’re encased in a cylinder with rough metal walls. A bumpy metallic sphere floats in front of the glowing blade of your lightsaber – which in the real world is a toy version with a blue light and whooshy sound effects, though you see the realistic virtual version. The sphere in your goggles spins around, shooting yellow pellets of light toward you as it does. It’s up to you to bring your weapon around and deflect each menacing pulse away before it hits you. If you do, you get a point. If you don’t, your vision fills with yellow and you lose one of your ten lives.
Tracking movement with magnetism
It takes more than just some fancy goggles to make the Jedi trainer work, though. A computer tracks your movement in order to translate your position into the game. How does it know where you are? In this system it is because, the whole time you’re playing the game, you’re also standing in a magnetic field. The field comes from a small box on the ceiling above you and stretches for about a metre and a half in all directions. Sixty times every second, sensors attached to the headset and lightsaber check their position in the magnetic field and send that information to the computer. As you move your head and your sabre the sensors relay their positions, and the view in your goggles changes. What’s more, each of your eyes receives a slightly different view, just like in real life, creating the feeling of a 3D environment.
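We don’t know exactly how Josh’s code was organised, but the heart of any system like this is a loop of roughly this shape. The sensor-reading and drawing functions below are placeholders invented for illustration, not the real system’s API.

```python
# A sketch of the core loop of a tracked VR system like the Jedi trainer.
# The sensor and drawing functions are placeholders for illustration only.
import time

EYE_SEPARATION = 0.065   # metres between the two eye viewpoints
FRAME_TIME = 1.0 / 60    # sensors are read sixty times a second

def read_magnetic_sensor(name):
    """Placeholder: return the (x, y, z) position of a sensor in the field."""
    return (0.0, 1.6, 0.0)

def render_view(eye_position, saber_position):
    """Placeholder: draw the virtual cylinder, droid and sabre for one eye."""
    pass

while True:
    head = read_magnetic_sensor("headset")
    saber = read_magnetic_sensor("lightsaber")
    # Each eye gets a slightly shifted viewpoint, which is what makes it 3D.
    left_eye = (head[0] - EYE_SEPARATION / 2, head[1], head[2])
    right_eye = (head[0] + EYE_SEPARATION / 2, head[1], head[2])
    render_view(left_eye, saber)
    render_view(right_eye, saber)
    time.sleep(FRAME_TIME)   # a real system would time each frame more carefully
```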
Once the sensors have gathered all the information, it’s up to the software to create and animate the virtual 3D world – from the big cylinder you’re standing in to the tiny spheres the droid shoots at you. It controls the behaviour of the droid, too, making it move semi-randomly and become a tougher opponent as you go through the levels. Most users seem to get the hang of it pretty quickly. “Most of them take about two minutes to get used to the environment. Once they start using it, they get better at the game. Everybody’s bad at it the first sixty seconds,” Josh says. “My mother actually has the highest score for a beginner.”
The atom smasher
Much as every Jedi apprentice needs to find a way to train, there are uses for Josh’s system beyond gaming too. Another student, Jess Vriesma, wrote a program for the system that he calls the “atom smasher”. Instead of a helmet and lightsaber, each sensor represents a virtual atom. If the user guides the two atoms together, a bond forms between them. Two new atoms then appear, which the user can then add to the existing structure. By doing this over and over, you can build virtual molecules. The ultimate aim of the researchers at Calvin College was to build a system that lets you ‘zoom in’ to the molecule to the point where you could actually walk round inside it.
The team also bought themselves a shiny new magnetic field generator that lets them generate a field almost nine metres across. That’s big enough for two scientists to walk round the same molecule together. Or, of course, for two budding Jedi to spar against one another.
In January 2025 computer scientist Simon Peyton Jones gave an inspiring lecture at Darwin College Cambridge on “Bits with Soul” about the joy, beauty, and creativity of computer science … from simple ideas of data representation comes all of virtual reality.
Our universe is built from elementary particles: quarks, electrons and the like. Out of quarks come protons and neutrons. Put those together with electrons in different ways to get different atoms. From atoms are built molecules, and from there comes ever more complexity, including the amazing reality of planets and suns, humans, trees, mushrooms and more. From small things, ever more complex things are built, and ultimately all of creation.
The virtual world of our creation is made of bits combined using binary, but what are bits, and what is binary? Here is a puzzle that Simon Peyton Jones was set by his teacher as a child to solve, to help him think about it. Once you have worked it out then think about how things might be built from bits: numbers, letters, words, novels, sounds, music, images, videos, banking systems, game worlds … and now artificial intelligences?
A bank cashier has a difficult customer. They always arrive in a rush wanting some amount of money, always up to £1000 in whole pounds, but a different amount from day to day. They want it instantly and are always angry at the wait while it is counted out. The cashier hatches a plan. She will have ready each day a set of envelopes that will each contain a different amount of money. By giving the customer the right set of envelope(s) she will be able to hand over the amount asked for immediately. Her first thought had been to have one envelope with £1 in, one envelope with £2 in, one with £3 and so on up to an envelope with £1000 in. However, that takes 1000 envelopes. That’s no good. With a little thought though she realised she could do it with only 10 envelopes if she puts the right amount of money in each. How much does she put in each of the 10 envelopes that allows her to give the customer whatever amount they ask for just by handing over a set of those envelopes?
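If you want to check an answer without spoiling the puzzle, here is a small sketch that tests whether a proposed set of envelope amounts really can make every value from £1 to £1000. It won’t tell you the answer, just whether your set works.

```python
# Check a proposed set of envelope amounts: can every value from £1 to £1000
# be made by handing over some subset of the envelopes? This doesn't reveal
# the answer, it just tests whichever amounts you try.
from itertools import combinations

def covers_all_amounts(envelopes, top=1000):
    reachable = set()
    for r in range(1, len(envelopes) + 1):
        for subset in combinations(envelopes, r):
            reachable.add(sum(subset))
    return all(amount in reachable for amount in range(1, top + 1))

my_guess = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # clearly not enough!
print(covers_all_amounts(my_guess))           # False
```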
Computer Scientists and digital artists are behind the fabulous special effects and computer generated imagery we see in today’s movies, but for a bit of fun, in this series, we look at how movie plots could change if they involved Computer Scientists. Here we look at an alternative version of the film Brassed Off.
***SPOILER ALERT***
Brassed Off, starring Pete Postlethwaite, Tara Fitzgerald and Ewan McGregor, is set at a time when the UK coal and steel industries were being closed down with terrible effects on local communities across the North of England and Wales. It tells the story of the closing of the fictional Grimley Pit (based on the real mining village of Grimethorpe), from the point of view of the members of the colliery brass band and their families. The whole village relies on the pit for their livelihoods.
Danny, the band’s conductor, is passionate about the band and wants to keep it going, even if the pit closes. Many of the other band members are totally despondent and just want to take the money that is on offer if they agree to the closure without a fight. They feel they have no future, and have given up hope over both the pit and the band (why have a colliery band if there is no colliery?).
Gloria, a company manager who grew up in the village, arrives to conduct a feasibility study for the company to determine whether the pit is profitable, as justification for keeping it open or closing it down. A wonderful musician, she joins the band but doesn’t tell them that she is now management (including not telling her childhood boyfriend, and band member, Andy).
The story follows the battle to keep the pit open, and the effects on the community if it closes, through the eyes of the band members as they take part in what is likely to be their final ever brass band competition…
Brassed Off: with computer science
In our computer science film future version, the pit is still closing and Gloria is still management, but with a Computer Science PhD in digital music, she has built a flugelhorn-playing robot with a creative AI brain. It can not only play brass band instruments but arrange and compose too. On arriving at Grimley she asks if her robot can join the band. Initially, everyone is against the idea, but on hearing how good it is, and how it will help them do well in the national brass band competition, they relent. The band, with robot, go all the way to the finals and ultimately win…
The pit, however, closes and there are no jobs at all, not even low quality work in local supermarkets (automatic tills and robot shelf-stackers have replaced humans) or call centres (now replaced by chatbots). Gloria also loses her job due to a shake-out of middle managers as the AIs take over the knowledge economy jobs. Luckily, she is OK: with university friends, she starts a company building robot musicians, which is an amazing success. The band never make the finals again as bands full of Gloria’s flugelhorn and cornet playing robots take over (also taking the last of the band’s self-esteem). In future years, all the brass bands in the competition are robot bands, as with all the pits closing the communities around them collapse. The world’s last ever flugelhorn player is a robot. Gloria and Andy never do get to kiss…
In real life…
Could a robot play a musical instrument? One existed centuries before the computer age. In 1737 Jacques de Vaucanson revealed his flute-playing automaton to the public. A small, human-height figure, it played a real flute, which could be replaced to prove the machine really was playing a real instrument. Since then robots have played various instruments: there have been drumming robots, and a cello-playing robot that performed with an orchestra in Malmö. While robot orchestras and bands are likely, it seems less likely that humans would stop playing as a result.
Can an AI compose music? The Victorian programmer Ada Lovelace predicted they one day would, a century before the first computer was ever built. She realised this would be the case just from thinking about the machines that Charles Babbage was trying to build. Her prediction eventually came true. Now, of course, generative AI is being used to compose music, and can do so in any style, whether classical or pop. How good, or creative, it is may be debated, but it may not be long before they have super-human music composition powers.
So, a flugelhorn-playing robot that also composes music is not a pipe dream!
What about the social costs that are the real theme of the film, though? When the UK pits and steelworks closed, whole communities were destroyed with great, and long lasting, social cost. It was all well and good for politicians to say new jobs were being created by the new service and knowledge economy, but that was no help when no thought or money had actually been put into helping communities make the transition. “Get on your bike” was their famous, if ineffective, solution. For example, if the new jobs were to be in technology as suggested, then massive technology training programmes for those put out of work were needed, along with financial support in the meantime. Instead, whole communities were effectively left to rot and inequality increased massively. Areas in the North of England and Wales that had been the backbone of the UK economy still haven’t really recovered 40 years later.
Are we about to make the same mistakes again? We are certainly arriving at a similar point, but now it is those knowledge economy jobs that were supposed to be the saviours 40 years ago that are under threat from AI. There may well be new jobs as old ones disappear… but even if there are, will the people who lose their jobs be in a position to take the new ones, or are we heading towards a whole new lost generation? As back then, without serious planning and support, including successful efforts to reduce inequality in society, the changes coming could again cause devastation, this time much more widespread. As it stands, technology is increasing, not decreasing, inequality. We need to start now, including coming up with a new economic model of how the world will work that actively reduces inequality in society. Many science fiction writers have written of utopian futures where people only work for fun (eg Arthur C Clarke’s classic “Childhood’s End” is one I’m reading at the moment), but that only happens if wealth is not sucked up by the lucky few. (In “Childhood’s End” it takes alien invaders to force out inequality.)
We can avoid a dystopian future, but only if we try…really hard.
This article is inspired by the Kilburn Lecture given by Professor Steve Furber of the University of Manchester on 20th June 2008.
Can you imagine designing the world’s road network from scratch? Plus all the pavements, footpaths, bridges and shortcuts? Can you imagine designing a computer with the complexity of a planet?
In Douglas Adams’ classic The Hitchhiker’s Guide to the Galaxy, there’s a whole planet devoted to designing other planets, and the Earth was one of their creations. In the story, Earth isn’t just a planet: it’s also the most powerful and most complicated computer ever made, and its job was to help explain the answer to the meaning of life. Aliens had to design every last bit of it – one character, Slartibartfast, had the particularly complex job of designing the world’s coastlines. His favourite thing to make was fjords, because he liked the decorative look they gave to a country. (He even won an award for designing Norway.)
That’s just a story though, right? Could anyone ever design a computer of planetary complexity from scratch? As it happens that is exactly the task facing modern computer chip designers.
It is often said that modern chips are the most complex things humans have ever created, and if you imagine starting to design a whole planet’s road network, you will start to get the idea of what that means. The task is rather similar.
Essentially a computer chip is made up of millions of transistors: tiny elements that control how electrons flow round a circuit. A microscopic view of a chip like the one above looks very much like a road network with tracks connecting the transistors, which are a bit like junctions. Teams of chip designers have to design where the transistors go and how they are connected. The electrons flowing are a little like cars moving around the road network.
There’s an extra complication on a chip though. Designers of a road network only have to make sure people can get from A to B. In a computer, the changing voltages caused by the electrons as they move around are how data gets from one part of the chip to another. The data also gets switched around and transformed as calculations are performed at different points in the circuit. That means chip designers have to think about more than just connecting known places together. They have to make sure that as the electrons flow around, the data they represent still makes sense and computes the right answers. That’s how the whole thing is capable of doing something useful – like playing music, giving travel directions or controlling a computer game. It’s like designing a planetary road network, except all the traffic has to mean something in the end! Just like the fictional version of the Earth, only for real.
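To get a feel for what “the traffic has to mean something” means, here is a toy sketch. Treat each wire as carrying a 0 or 1 (a low or high voltage), take a NAND gate (in reality a tiny arrangement of transistors) as the one building block, and wire copies of it together so the circuit adds two one-bit numbers. It is a simulation for illustration only, not how chip designers actually write their designs down.

```python
# A toy of the idea that the "traffic" on a chip computes something.
# A NAND gate is a small standard arrangement of transistors; treat it as
# our one building block and wire copies together into a one-bit adder.
def nand(a, b):
    """One gate: a few transistors acting as switches."""
    return 0 if (a and b) else 1

def and_gate(a, b):
    return nand(nand(a, b), nand(a, b))

def xor_gate(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    """Add two one-bit numbers: the wiring makes the voltages mean a sum."""
    return xor_gate(a, b), and_gate(a, b)   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
```

Chain enough of these together and you have circuits that add, compare and shuffle whole numbers: the electrons flowing round the “roads” really are computing answers.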
It’s actually even harder for chip designers. Nowadays the connections they have to design are smaller than the wavelength of light. All that complexity has to fit, not on something as big as a planet, but crammed on a slab of silicon the size of your fingernail! Pretty impressive, but Earth’s intricate fjords are still more beautiful (especially the ones in Norway).
– Paul Curzon, Queen Mary University of London (from the archive)
Computer Scientists and digital artists are behind the fabulous special effects and computer generated imagery we see in today’s movies, but for a bit of fun, in this series, we look at how movie plots could change if they involved Computer Scientists. Here we look at an alternative version of the film Tsotsi.
***SPOILER ALERT***
The outstanding, and Oscar winning, film Tsotsi follows a week in the life of a ruthless Soweto township gang leader who calls himself Tsotsi (township slang for ‘thug’). Having clawed a feral existence together from childhood in extreme urban deprivation he has lost all compassion. After a violent car-jacking, he finds he has inadvertently kidnapped a baby. What follows, to the backing of raw “Kwaito” music, is his chance for redemption.
Introducing new technology does not always have just the effect you intended …
Tsotsi: with computer science
In our computer science film future version the baby is still accidentally kidnapped, but luckily the baby has wealthy parents, so wasn’t born in the township and was chipped with a rice-sized device injected under the skin at birth. It both contains identity data and can be tracked for life using GPS technology. The police are waiting as Tsotsi arrives back at the township having followed his progress walking across the scrubland with the baby.
Tsotsi doesn’t get a chance to form a bond with the baby, so doesn’t have a life-changing experience. There is no opportunity for redemption. Instead, on release from jail, he continues his violent crime spree with no sense of humanity whatsoever.
In real life…
In 2004 there was a proposal in Japan that children should be tagged in the way luggage is. Tagging is now a totally standard way of tracking goods as they are moved around warehouses, and a way to detect goods being shoplifted too. After all, if it is sensible to keep track of your suitcase in case it is lost, why wouldn’t you do the same for your even more important child? Fear of a child going missing is one of the biggest nightmares of being a parent. Take your eyes off a toddler for a few seconds in a shop and they could be gone. Similar proposals have repeatedly surfaced ever since. In 2010, for example, nursery school kids in Richmond, California were for a while required to wear jumpers containing RFID tags, supposedly to protect them. By placing sensors in appropriate places the children’s movements could be tracked, so if they left school they could quickly be found.
Of course, pet cats and dogs are often chipped with tags under their skin. So it has also been suggested that children be tagged in a similar way. Then they couldn’t remove whatever clothing contained the tag and disappear. Someone who had kidnapped them would of course cut it out as, for example, Aaron Cross in the Bourne Legacy has to do at one point. Not what you want to happen to your child!
In general, there is an outcry and such proposals are dropped. As was pointed out at the time of the California scheme, an RFID tag is not actually a very secure solution. There have been lots and lots of demonstrations of how such systems can be cracked (even at a distance). For example, the RFID tags used in US passports were cracked so that the passports could be copied at a distance. And if the system can be cracked, then bad actors can sit in a van outside a school, or follow a class on a school trip, and track those children. Not only does it undermine their privacy, it could put them in greater danger of the very kind it was supposed to protect them from. Ahh, you might think, but if someone did kidnap a child then the chip would still show where they were! Except that if tags can be copied, then a duplicate could be used to leave a virtual version of the child in the school where they should be.
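Here is a toy illustration of why a tag that just broadcasts a fixed ID is so easy to copy, compared with one that answers a fresh cryptographic challenge. It is a simplified sketch, not any real RFID product’s protocol.

```python
# A toy contrast between a tag that just replies with a fixed ID (trivial to
# record and replay) and one that answers a fresh challenge using a secret
# key. A simplified sketch, not any real RFID product's protocol.
import hmac, hashlib, os

class DumbTag:
    def __init__(self, uid):
        self.uid = uid
    def respond(self, challenge):
        return self.uid            # same answer every time: overhear it, clone it

class ChallengeResponseTag:
    def __init__(self, secret_key):
        self.key = secret_key
    def respond(self, challenge):
        # The answer depends on the fresh challenge, so a recorded reply is useless.
        return hmac.new(self.key, challenge, hashlib.sha256).hexdigest()

def reader_accepts(tag, expected_key):
    challenge = os.urandom(16)     # a new random challenge every time
    expected = hmac.new(expected_key, challenge, hashlib.sha256).hexdigest()
    return tag.respond(challenge) == expected

key = os.urandom(32)
print(reader_accepts(ChallengeResponseTag(key), key))   # True: genuine tag
print(reader_accepts(DumbTag(b"child-1234"), key))      # False: a replayed ID fails
```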
Security and privacy matter, and cyber security solutions are NEVER as simple as they seem. There are so often unforeseen consequences, and fixing one problem just opens up new ones. Utopias can sometimes be dystopian.
– Paul Curzon, Queen Mary University of London (extended from the archive version)
Annie Easley. NASA, Public domain, via Wikimedia Commons
Annie Easley was a pioneer both as a computer programmer and as a champion of women and minorities in computer science. She went from being a human computer doing calculations for rocket scientists (in the days before computers were machines) to becoming a programmer whose programs were integral to many NASA projects. Her work has helped us explore the planets and beyond, put satellites into space and help humans leave the Earth. She also contributed to early battery technology as well as the alternative energy sources we now need to transition away from oil and gas. Throughout her career, despite being repeatedly discriminated against herself as an African-American woman, she encouraged, supported and mentored others like her.
Annie was a maths graduate, so when she saw that computers were needed by NACA, the predecessor of NASA, she jumped at the chance. At the time a computer was a human who did calculations, as no machine had yet been created to take over the job. She was one of only four African-American employees out of several thousand. Her job was to do the calculations researchers needed for their work. However, digital computers then started to be introduced: machines able to do large numbers of tedious calculations much more quickly than humans, so they took over the job… but now people were needed who could program them for each task. Doing so still needed mathematical ability to understand the task, as well as the ability to write code. She learnt both low level assembly language and the high level language Fortran, invented for such scientific programming work, and transitioned to being a programmer-mathematician.
Much of her work involved or supported simulation: writing programs that model aspects of the real world to test whether scientists’ predictions are correct, or to help make new predictions. Ultimately, this work would help provide the data to make choices about which technologies to use. Today computer simulation is a completely standard way of doing both engineering and science, and has provided a completely new way to do science, complementing theory and experiment. It allows us to probe everyday science questions but also big questions like exploring the origins of the universe or probing the long-term consequences of our actions on the climate. Back then it was totally novel, though, as computers were completely new. She was involved in simulation work that prefigured important work today around the environment, investigating systems to convert energy between different forms, and so hybrid battery technology. This allows a vehicle (whether a rocket, satellite, car or planetary rover) to switch between electric power and other sources of energy – an idea that has provided an important bridge from petrol to electric cars. She was also part of teams exploring alternative energy sources like wind power and solar power (important of course now in space for satellites and planetary rovers, as well as fossil fuel alternatives on Earth).
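We don’t have Annie’s actual programs, but the flavour of this kind of simulation can be sketched with a toy model: step time forward hour by hour, switch a rover between solar power and its battery, and see whether a proposed battery size survives the nights. All the numbers are invented for illustration.

```python
# A toy flavour of this kind of simulation (not Annie Easley's actual code):
# step time forward hour by hour and model a rover switching between solar
# power and its battery, to test whether a proposed battery design is enough.
def simulate_rover(battery_capacity_wh, load_w=30, solar_peak_w=100, hours=72):
    charge = battery_capacity_wh
    for hour in range(hours):
        daylight = hour % 24 < 12                  # crude 12-hour day/night cycle
        solar = solar_peak_w if daylight else 0
        charge += solar - load_w                   # net energy this hour
        charge = min(charge, battery_capacity_wh)  # the battery can't overfill
        if charge <= 0:
            return f"battery flat after {hour + 1} hours"
    return "survived the whole run"

for capacity in (200, 400, 800):
    print(capacity, "Wh battery:", simulate_rover(capacity))
```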
An Atlas rocket with a Centaur final stage. NASA, Public domain, via Wikimedia Commons
One of her major areas of work, which has had a lasting impact, was on the Centaur rocket. Rocket launches involve multiple stages of fuel tanks to get the payload (eg a satellite) into space. The tanks of each stage are ejected when their fuel runs out, with the next stage taking over. Centaur was the final, upper stage, which used the then novel fuels of liquid hydrogen and liquid oxygen to propel the payload on the final step into space. Centaur became a mainstay for satellite launches as well as for probes sent to visit other planets – like Voyager (which visited the outer planets and is now in interstellar space, heading away from the solar system) and Cassini–Huygens (which sent back stunning images of Saturn’s rings). Newer versions of Centaur are still used today.
At the same time as doing all this work she was also heavily involved in NASA’s public engagement with science programmes, visiting schools and giving talks about the work, inspiring girls and those from ethnic minorities that STEM careers were for them. She also worked as an equal employment opportunity counselor, helping sort out discrimination complaints (whether over age, race or gender) in a positive and cooperative way.
Space travel has opened up not only a new ability to explore our solar system, but has made lots of other technologies, from SatNav to remote monitoring, possible, as well as helping in the development of other technology such as batteries and alternative energy sources. We all owe a lot to pioneers like Annie Easley, and none more so than the private companies now aiming to further commercialise space.