Can you provide some background on yourself (college years to present)?
I started out as a pharmacology student at the University of Leeds. I was really interested in how drugs interacted with the body and in the creative aspect of it - developing new cures for things. But I found the lab work really quite repetitive and I wasn’t very good at it. I was a bit all fingers and thumbs, and while I was quite good at coming up with my own processes, I was much less adept at following other people’s procedures.
During that time we had to use a lot of analysis software, which I found really frustrating because it didn’t work well. My ex-boyfriend, who was doing a computer science degree, got fed up with me whingeing about the software, chucked his textbook at me and said ‘right, make one yourself, if it’s that bad!’ I did; I wrote my own analysis software for my own experiments in the lab. I could do all the maths, so it was just a case of learning how to program, which I did - it was a hell of a lot easier than I thought, but because I’d never done it before I didn’t know that I could do it.
So I decided to take modules in computer science in the second and third year, which I did much better in than the pharmacology modules. I was in the top ten of the year for the modules that I took, which was really cool. I decided at the end of it that it was obvious I should be doing computer science and not pharmacology; hence I did a Masters in multidisciplinary informatics, which is a pain to write on every application form I have to do!
It roughly translates as funky applied computing - it’s the application of computing to lots of different types of stuff. You view computing and computer science as a tool and then get your teeth into what the domain actually requires and match the two up. So it’s not being a bit of a mongrel, where a biologist is a bit of a computer scientist or vice versa, it’s learning to marry the two disciplines up. You actually do bridge the two and I learned the skills to do that.
Then for my PhD I studied biologically inspired computing. That involved the development of artificial immune systems. I worked with practical immunologists at the University of the West of England, in Bristol, and would build some models of how a particular cell in the immune system works and go down and get them to verify, experimentally, if the model was valid or not.
If there were any questions, they would ask me, we’d discuss it until we got a good enough model, then I’d make an algorithm from that model and apply it to computer security; so using an artificial immune system to detect artificial pathogens, effectively. And that’s where the bulk of my research has been.
So you’re modelling the potential problems and also the potential solutions?
Yes, that’s right. At the same time, because that’s what you need to do to develop anything that complex.
How would you describe an artificial immune system?
There’s not just one artificial immune system - it’s not like a genetic algorithm or an artificial neural network, where there’s an archetypal one and then you have variations within that. Artificial immune systems aim to use metaphors of components of the actual immune system to perform computation, to solve problems in the engineering domain. However, the immune system is not just made up of a single immune cell.
An immune system is a complex collection of what they call heterogeneous cell types - it’s got loads of different cells with lots of different properties solving lots of different problems in parallel, which gives you protection against pathogens. So there isn’t a single artificial immune system because there isn’t a single immune cell. What artificial immune systems are, therefore, is a collection of algorithms, each one picking out the different properties of computation inherent in the different types of cell, which collectively form the immune system.
So could systems like this help to cure certain diseases?
They are applied back to biology, but that’s just one of the applications that they are used for. They’re mostly used for applications that are ‘noisy, real-world apps that require error tolerance, require decision making and require adaptivity’ in their function.
So, basically, any kind of complex, real-world, 21st century problem is where they’re coming in. Data is getting bigger, it’s getting messy, it requires correlation, it requires fault tolerance in our systems, because the systems are getting too big to monitor, so that’s where these kinds of algorithms come in, not for solving little toy problems in optimisation anymore.
Is that where your dendritic cell algorithm comes in?
It’s one of them. It’s what we call the second generation.
Is that ‘dendritic’ in the old sense of the word - with different strands branching off?
Yes, it’s an immune cell, which up-regulates its surface area when it sees damage. It up-regulates its surface area for lots of reasons. Dendritic cells, to put it nicely, are the policemen of the body; they run around the tissue collecting evidence regarding the health of that particular tissue. And they also hoover up, through a process called phagocytosis, loads of protein molecules.
So they’re checking if everything is kosher and then ‘eating’ all this other stuff and, if they see enough damage, they mature and go to a lymph-node where all the other immune cells hang out. They communicate with them, telling them where the damage is and what proteins they found nearby the damaged area, as one of those proteins has to belong to something that actually caused the damage.
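To make that behaviour concrete, here is a minimal toy sketch in Python of a dendritic-cell-style monitor: an artificial cell samples ‘antigens’ (identifiers for the things it encounters) together with danger-like and safe-like signals, and once it has gathered enough evidence it ‘migrates’ and labels everything it picked up. The class name, signal names and threshold are illustrative assumptions for this article, not the parameters of the published Dendritic Cell Algorithm.

# Toy sketch of a dendritic-cell-style monitor (illustrative only; the names
# and the threshold below are assumptions, not published algorithm parameters).

MIGRATION_THRESHOLD = 10.0  # how much evidence a cell collects before it reports

class ArtificialDendriticCell:
    def __init__(self):
        self.evidence = 0.0   # total amount of signal sampled so far
        self.danger = 0.0     # accumulated danger-like (necrosis) signal
        self.safe = 0.0       # accumulated safe-like (apoptosis) signal
        self.antigens = []    # identifiers of the 'proteins' hoovered up en route

    def sample(self, antigen_id, danger_signal, safe_signal):
        """Collect one piece of evidence from the tissue being patrolled."""
        self.antigens.append(antigen_id)
        self.danger += danger_signal
        self.safe += safe_signal
        self.evidence += danger_signal + safe_signal

    @property
    def matured(self):
        return self.evidence >= MIGRATION_THRESHOLD

    def report(self):
        """On 'migration to the lymph node', label everything the cell collected."""
        context = "dangerous" if self.danger > self.safe else "normal"
        return {antigen: context for antigen in self.antigens}

# Usage: monitor a stream of (antigen, danger, safe) observations, e.g. process
# IDs paired with error rates (danger) and normal-activity indicators (safe).
cell = ArtificialDendriticCell()
stream = [("proc_a", 0.2, 1.0), ("proc_b", 2.5, 0.1), ("proc_b", 3.0, 0.2),
          ("proc_a", 0.1, 1.5), ("proc_b", 2.8, 0.3)]
for antigen, danger, safe in stream:
    cell.sample(antigen, danger, safe)
    if cell.matured:
        print(cell.report())  # {'proc_a': 'dangerous', 'proc_b': 'dangerous'}
        cell = ArtificialDendriticCell()

In a real artificial immune system many such cells run in parallel over overlapping windows of data and their verdicts are aggregated, which is part of what gives the approach its error tolerance.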
So how do you actually model for that - without getting too technical?
Well, you stay awake for many, many nights…! You spend a lot of time on a train visiting immunologists, you look in the lab, you poke these cells, you see what happens, you hypothesise various ‘what if’ scenarios. It’s a really iterative process; it’s not a case of ‘you finding out some information and then making a model from it’.
You start off with pen and paper, or as I did with bits of string, Sellotape, Blu-Tack and bits of coloured paper on a big sheet of A0 paper; you put the different characters down and make different paper components for the different parts of the analysis, and that’s how you generate the conceptual model - or that’s how I do it, because I’m from the Blue Peter age!
Because you’ve got someone from the computer security domain trying to understand the usefulness of this model and you’ve got an immunologist sat next to you - how are you going to make these people talk together?
You can’t really do it by showing the biologist an algorithm and you can’t really do it by showing some experimental results to a computer security guy so you’ve got to resort to basics. Everyone’s kind of done the old Blue Peter thing. So that was basically my role, to get feedback from these very different sets of people at the same time. Because of my history I spoke both languages.
It must be useful for you coming from a biological background, but also understanding the IT side of things too…?
Oh yes. Communication is the key part of any interdisciplinary stuff; if you don’t have someone who speaks both languages it must be really, really difficult.
Tell us a bit about your work on emotion discrimination for the entertainment industry?
It’s basically the incorporation of bio-sensing technology into computer interfaces; that’s where the proper, serious side of the research is.
And how is it progressing?
It’s really coming to the fore. So what we do is to put people into different scenarios and see if we can physiologically measure their emotional state. Whether we can actually determine emotion or not, I’m not entirely convinced, but we can certainly discriminate between fear and enjoyment.
I suppose in the case of fear you can measure aspects of physiological change by monitoring how much sweat they produce and so on…
It’s not quite that simple - I wish it was. If you look at the characteristics of the GSR (Galvanic Skin Response), you can get some idea of what a person is feeling - or at least how stressed out a person is. But it’s never that simple - you need more data. This is where the artificial immune systems come in, because they can process multiple streams of complex data that occur asynchronously.
So you’ve got someone’s heart rate; you can stick a hat on them (a bio-sensing kit to measure their EEG and their brainwaves), which is pretty weird to look at; you can measure their breathing rate, how often they’re blinking, and how often certain muscles are being activated - the smiley ones or the frowny ones - so you can tell the difference between those. Looking at any one of those data streams in isolation, you can’t tell what the person is doing - a standard time-series statistical technique on a single stream is insufficient.
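As an illustration of why a single stream is not enough, here is a small Python sketch that resamples several asynchronously recorded physiological signals onto a common timeline so that simple windowed features can be computed side by side and handed to a downstream classifier. The stream names, sample values and nearest-sample resampling are assumptions made for the example - this is not the group’s actual pipeline.

# Illustrative sketch (not the group's actual pipeline): fusing several
# asynchronously sampled physiological streams onto a common timeline so that
# windowed features can be compared side by side.

import bisect
from statistics import mean

def resample(stream, timeline):
    """stream: sorted list of (timestamp_s, value); pick the nearest sample per tick."""
    times = [t for t, _ in stream]
    out = []
    for t in timeline:
        i = bisect.bisect_left(times, t)
        # choose whichever neighbouring sample is closest in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(stream)]
        j = min(candidates, key=lambda k: abs(times[k] - t))
        out.append(stream[j][1])
    return out

# Raw streams arrive at different, irregular rates (timestamps in seconds).
heart_rate = [(0.0, 72), (0.8, 75), (1.9, 90), (3.1, 110)]
gsr        = [(0.0, 1.1), (0.5, 1.2), (1.0, 1.9), (1.5, 2.6), (2.0, 3.0), (3.0, 3.2)]
blink_rate = [(0.0, 12), (1.5, 20), (3.0, 28)]

timeline = [0.0, 1.0, 2.0, 3.0]  # common 1 Hz grid
fused = {
    "heart_rate": resample(heart_rate, timeline),
    "gsr": resample(gsr, timeline),
    "blink_rate": resample(blink_rate, timeline),
}

# A crude per-window summary that a downstream classifier (or an AIS-style
# algorithm) could consume instead of any single stream on its own.
for name, values in fused.items():
    print(name, "mean:", round(mean(values), 2))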
However, you can also measure people’s productivity; you can measure workplace stress. Products like this are already on the market and can be given to people - for example, to help those with bipolar disorder, giving them the ability to monitor whether or not they’re about to have an upper or a downer. They can then interface with the computer, which can give them a program to calm them down, to cheer them up or just to bring them back to a more level state.
I like to use it in entertainment because, well, it’s fun and it gets me out of the office! Seriously though, rollercoasters were used initially to collect data because people thought they would give a constrained, extreme response - and you need extreme responses, not subtle ones, just to test out an idea.
So, for example, you can do sensory deprivation where someone just sits there in a room with ambient lighting, controlled temperatures, no windows, no sound, the room is soundproofed, and you can give them no stimulation and then you need something at the other end. But you need it to be reproducible - so just giving someone a shock isn’t really very good.
We didn’t like the idea of electroshock therapy in this case because that would affect the equipment - it’s all transmitted via Bluetooth and stuff, so it’s not cool. By sticking them on rollercoasters we thought it might generate a large enough effect for us to be able to see what’s happening. Plus it’s reproducible, because rollercoasters are inherently deterministic - with the exception of one in the UK, Spinball Whizzer at Alton Towers, which is now called Sonic Spinball.
They’ve recently repainted it. It’s a Maurer Söhne construction, like the rollercoasters at Blackpool Pleasure Beach in the late 60s, with the sharp turns, but the carriages are on a rotating base. You’ve got four people in the carriage and the weight distribution determines how many times the carriage spins at the different corners; it also depends on who you’ve got in with you, how wet the track is and how hot the ambient temperature is - you can get a different ride every time. By carefully monitoring people on these rides, watching if they’re about to be sick and so on, you can lock down the degrees of freedom, as long as it’s suitably easy to replicate.
Who’s funding this work?
The Engineering and Physical Sciences Research Council (EPSRC), primarily. They funded a feasibility study to monitor people on various different rollercoasters off the back of an initial event, and now this work takes place under the umbrella of Horizon Digital Economy Research, of which Nottingham is a hub.
I think we’ve got something ridiculous like £18 million over the next five years. So we can apply for money and researchers through that and because this strand was central to securing the grant, they’re generally quite open to suggestion.
So are you funded by grants or are you funded by the university as a lecturer?
I’m funded by the university as a lecturer, which is nice - it takes the pressure off.
How long have you been lecturing for?
I’ve been two years in post - I’ve just had my two year party.
What aspects of the job do you enjoy the most and what parts are more difficult?
That’s a difficult question, but I really enjoy finding new ways of doing stuff. For example, developing complex analysis tools and thinking about how complexity can actually be applied. I think that’s pretty awesome because it’s hard.
If you read some of the old complexity science literature, which is full of abstract maths and physics, and then you look at something elegant like a bunch of starlings swarming off Aberystwyth pier, you think how amazing that is - we should really be able to fathom this activity using computational systems. I really enjoy thinking about that sort of stuff and then actually doing it.
Are your sponsors happy with the way things are going - do they set you goals to do?
You have deliverables and I generally deliver twice as many papers as I’m supposed to. I do enjoy writing - if I ever get bored of university I’ll have your job off you!
On your website it mentions danger theory. Can you explain danger theory?
Danger theory is the core theory of immunology from which we’ve been building our models for the last couple of years. The classical theory of immunology is a bit like a lock-and-key scenario, in that the immune system is trained to respond to proteins which are defined as non-self; so you have this classical view of immunology, the ‘self/non-self’ theory, which is very discriminatory.
This theory stipulates that the human immune system is tuned and developed to mount responses against any particles that don’t belong to self. The idea was that during embryonic development you sample enough proteins from your own body to build up a representative picture of self, and the immune system is tuned by deleting anything that matches self, producing detectors which then go out into the body to detect anything that is non-self. That theory stood for about 70 years as the central dogma of immunology.
But there are loads of problems with that theory, because what is known as self changes over time, and friendly bacteria in your gut are not part of self and yet co-exist there. Also, why do we get auto-immune diseases, which a lot of people have - for example rheumatoid arthritis, where the immune system attacks the body itself? If it’s trained not to do that, why is it doing it? There are a lot of things the classical self/non-self theory can’t explain.
So in 1994 this rather wayward immunologist, Polly Matzinger, came along and said the immune system doesn’t respond to self and non-self, it responds to whether it’s in danger or not, and the self/non-self thing is just another filter. What it says is that you have an initial self/non-self pass and then you have additional signals, which come potentially from things like bacterial sugars - if you encounter those you can be pretty sure it’s a bacterium - so that part is signal-based.
Then you’ve got the danger-based model, which says that if this protein is around when there’s danger, then you should really respond to it, because where there’s danger there’s often damage. Danger signals are generated when cells undergo necrosis (uncontrolled cell death), as opposed to the signals of tolerance that occur when cells undergo apoptosis, which is controlled cell death.
If you’ve got a greater level of necrosis as opposed to apoptosis, then the immune system should either take a closer look or sort it out and send the boys in really! It’s the dendritic cells that relay the information ‘danger’ back to the central processors of the immune system in the lymph nodes.
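Restated as a toy decision rule - my paraphrase of the theory for illustration, not a published model - the idea is to weigh evidence of uncontrolled cell death (necrosis) against controlled cell death (apoptosis) and only raise a response when necrosis clearly dominates. A minimal Python sketch with an illustrative threshold:

# Toy restatement (an assumption for illustration, not a published model) of the
# decision rule described above: respond only when messy cell death outweighs
# tidy, programmed cell death.

def tissue_context(necrotic_events, apoptotic_events, ratio_threshold=1.5):
    """Return 'danger' when necrosis clearly outweighs apoptosis, else 'tolerate'."""
    if apoptotic_events == 0:
        return "danger" if necrotic_events > 0 else "tolerate"
    ratio = necrotic_events / apoptotic_events
    return "danger" if ratio >= ratio_threshold else "tolerate"

print(tissue_context(necrotic_events=3, apoptotic_events=10))  # tolerate: mostly controlled cell death
print(tissue_context(necrotic_events=12, apoptotic_events=4))  # danger: lots of messy damage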
The immune system gets it wrong from time to time though…and detects danger where there isn’t danger?
Yes, or it’s marginal... or the threshold on the dendritic cell (DC) was too low, or you get the ‘bystander effect’, which is how they think MS (multiple sclerosis) happens. That’s where a person gets an infection, too much cell protein is taken in and the damage is seen at the same time as an invader, so the cells are presenting the two proteins. Both sets of protein are then responded to by the immune system; however, one was from a minor infection and one is from yourself, therefore the ‘danger’ persists and the immune system keeps attacking it.
So the danger theory is not saying immune systems are brilliant, it says immune systems are flawed but they’re much more sophisticated than the classical ‘you’re either in or you’re out’ sort of ‘bouncer with a blacklist’ approach to immunology.
It’s still very controversial - the receptor for the detection of danger signals was only found about 18 months ago, about 14 years after the theory was first proposed. That’s why we like to try and model it, because there’s a lot of work now being done in that area. Initially that work was to try and disprove this lady’s theory, but now it’s a case of ‘oh, she’s kind of right, we were all kind of right, so now we need the evidence to back it up’. It’s a really interesting time to be working on AIS because immunology is changing really rapidly.
Was this what you were talking to the Bristol Branch of BCS about recently?
No, I was talking about the rollercoaster stuff because that lends itself much better to public engagement. Public engagement is what I’m supposed to do - it’s my admin role within the School of Computer Science and I’m school liaison for our department so I do a lot of stuff for school kids, but I somehow got involved in a lot of women’s groups, which gave me the opportunity to give a lot of talks on it and people have met me through that.
They either ask me to talk about the role of women in computer science or they ask me to talk about the rollercoaster stuff. Because I’m trying to attract people into computer science or to say ‘look at the diverse array of applications, look at all the different things you can do with it’, I tend not to talk about the core research outside of academic circles.
I guess for a lot of people it goes over their heads fairly quickly…
It would take me half an hour just to put the background down. However, the majority of people, including the man in the pub, understand the basic concept of what we do, namely designing computer immune systems to help fight against computer viruses.
Do you think the IT profession’s reputation of being full of nerds and geeks is deserved and do you think anything can be done to improve this stereotypical image?
I think it was true initially, but it’s a very different world now. I’ve got this photo that I show in my slides, of Microsoft in 1979, and you should see the quality of the beards - they’re unbelievable! I saw it and thought, this is what people think we’re all like! But we’re really not. This is a difficult one - there’s no easy answer because there are a number of factors.
So do you think this beardy, weirdy image still persists with young girls?
Yes. But I think it’s also a lack of confidence. If you watch boys and girls playing with computers, the boys will nearly always try and wrest the controls off them. I think most people still think that most of the people in IT still look like this (points to a photo full of blokes who look like George Lucas wearing sandals) - even the women look like they should have beards!
They’re like escapees from a hippy commune…
That’s partly the fault of the image the industry has attached to it. Obviously people have got a lot smarter and more professional-looking now.
You’re out there liaising with schools, which is obviously a good thing, but do you think the government should be doing more?
Yes, I think so, because computer science isn’t on the national curriculum. I do taster days for year 10 to year 13 kids, who come into our department and they learn some programming through Scratch, which is really visual, and graphic, nice and friendly and happy.
They know how to do it (it’s applied problem solving), but show me a class in school which has that... Maths is really prescriptive now, technology is really vague, and IT lessons are all Excel spreadsheets and Access databases - if you’re lucky, you get to do a webpage. But that’s not problem solving, using algorithms, or even thinking about how you’d go about solving a problem. It’s either very subtle or it’s just not there.
We’re starting to get more girls taking IT A-levels, but the general numbers of students taking up IT are going down. It’s looking pretty bleak at school level, so I think the government really needs to step in if they want us to push forward in this sector in this country. And we’re actually pretty damn good at it, to be honest. They’ve got to start introducing proper computing somewhere in the national curriculum.
Do you think what puts some girls off is that they don’t want to be the only girl in the class?
They feel intimidated, yes. It’s really intimidating. I had that in my electronics class at school. But when you get to university level, it’s an advantage; everyone knows you and you get treated a bit nicer because you’re the girl. People actually remember your name and you actually get the personal touch at university. That’s what I found anyway, although it might have been due to the fact that I had pink hair!
It does shock me, though, when I talk to these kids and ask them if they like computing and they say ‘I like putting numbers into a spreadsheet’ and I say ‘really? I don’t!’ That’s not what computer science is about! The problem is that we have quite a high drop-out rate as a result, because you get students turning up thinking it’s going to be that, and they get a bit of a shock when they find out it’s quite maths-intensive and involves quite a lot of programming - as it needs to, if you’re going to be a software developer.
It’s the same with the gaming industry - people don’t seem to realise that it involves a lot of programming and maths... it’s not just about pretty pictures.
Going back to the issue of women in the IT industry - do you think the glass ceiling still exists?
Sadly, yes. I like to get out of bed and believe that it doesn’t, but I know it does.
Do you think that might change over the coming years?
Yes, I hope so, but it won’t if there aren’t enough girls coming up from school level. If there’s not enough mass of girls, if there’s not a loud enough voice, then it won’t change. Unfortunately, I think the point at which career women reach the upper levels is right about the same time as their biological clock is ticking, because you’re kind of running out of time.
Many reach senior management level at about 35. I personally have been too busy studying to do anything about that. But at the age of 35 you are faced with that choice - are you going to go into the boardroom or are you going to have a family? They say they’re increasing support for maternity leave and this, that and the other, but it’s all on paper.
I know from a personal perspective you’ve got to make a choice. So that’s possibly why there’s not so many female professors or so many women in the boardroom because, if you do go down the traditional career path, your chance to get into the boardroom will often coincide with that last chance to start a family.
If you leave it any later, it gets more difficult - maybe the best way is to start a family at 17 and then get a good run at a career later on in life? I don’t know, that’s just one hypothesis. Another one is that women just don’t ask for promotion as much, they just don’t go knocking on the boss’s door as heavily as men do.
What are your thoughts on professionalism in IT - do you think the government should get more involved or do you think industry should regulate itself?
I’m averse to doing more exams - I think there are too many exams as it is! I think it should be experience-based - I think there should be some sort of points system where you get points for completing a really good project.
I think the industry needs to self regulate, whereby the industry awards some sort of standards to completed projects - a bit like LinkedIn lets you recommend the work of other people - some sort of network like that, which enables you to recommend other people, would be far more useful than someone passing an exam.
I know some of the students that I’ve got can pass a programming exam but can’t program for toffee! I know some of my doctor friends have to keep studying in order to become registrars, but I think: can’t the assessors just follow you around, see how you are with patients and see if you’re actually good enough? I think that would be a better approach. I know that lawyers and doctors have to do more exams to get chartered, but I don’t necessarily feel that’s the right way.
I think there are some things, especially in IT, where it just comes down to practical problem solving, which you can’t really test with exams, if you know what I mean?
I think it’s a good idea to regulate - I think there are a lot of shysters out there - a lot of money for old rope, but at the same time I wouldn’t like to see a situation where you had to pass one exam to be able to do this and another to perform another task. I just think that’s a nice money-making enterprise for someone to run those courses.
Who were your role models that inspired you to get involved in IT?
I was quite gender-blind until quite recently - so it wasn’t really a case of female role models. I was so stubborn about it, I didn’t think my gender would mean anything. I just thought I don’t need a female role model. I think, to be honest, I could have done with one looking back on it. But I’m trying to engage a bit better now.
So are there any male role models you aspire to be like?
Yes, there’s a professor in Bristol, Dave Cliff, who’s awesome. He hired me for Hewlett Packard and I worked with him for nine months. He was amazing - he just can’t sit still, it’s unbelievable. I like the way in which he can get on and do some really good stuff, but the way in which he can present it as well and the way in which he interacts with people and comes up with really original stuff and keeps coming out with more original stuff, and has a family, and goes running and all the rest of the stuff he does, it’s really amazing.
Does he sleep?
I’m not convinced of it! I’ve seen him on Facebook at 2am, so I’m not sure.
What would you say have been the most exciting, most groundbreaking, changes in the IT industry over the last five or six years?
Mobile computing. It’s completely changed society. It’s introduced people to computing in a way in which they’re not even aware that it’s computing. I think that’s awesome, because that’s a revolutionary thing to do. To get people involved in something they’re not even aware of and then it just becomes part of the language.
I mean my mum looks at my Twitter to see what I’m up to; if she can’t reach me on the phone, she looks for me on the internet. It’s so different from how it used to be. So I would say mobile computing and social networking via mobile computing have changed everything. I wasn’t expecting it to be that big. I mean when you first got a camera on your mobile you thought it was such a white elephant, but now it’s a staple ingredient. When the first Nokias came out with WAP on them, or something like that, and the internet took forever to open, we thought internet on phones would never work.
When I go into a school, as part of my talk about ‘computing is everywhere’, I get the kids to put their hands up if they’ve got a mobile, and pretty much everyone puts their hand up. Then I say ‘keep your hand up if you’ve got a smartphone’, and about a third keep their hands up - and these are 14-year-olds! They’re not like my lot, who you’d expect to have one - I don’t personally. So I think that is impacting the entire spectrum, that’s massive, and the development of apps for that has changed the business model for IT.
What are the biggest challenges in your own sphere of study that you see will need to be cracked over, say, the next decade?
Maintaining the security of these networks and devices is going to be a problem. Nobody has really gone to town so far on mobile or cloud security and this is where I see these complex analysis tools coming in.
Quick questions
Open source or proprietary?
Open source all the way. It’s also bogus when people (like me a few years ago) say that proprietary is the only way to make money out of software. It’s not true - a friend of mine has done really well out of open source.
Apple or PC?
Apple, because they work. When I go home I don’t have to be a system administrator, but I can still tinker using the terminal. But I don’t like their aggressive marketing policy.
Wii, Xbox or Playstation?
I love my Wii - I’m currently losing weight just by doing Just Dance 2 for thirty-five minutes a day; it’s brilliant. The interface is so innovative, and I know that the Xbox Kinect is out and the PlayStation Move is out, but the Wii was there first. The Move isn’t as good. I’m getting a Kinect because somebody’s hacked it so you can now plug it, via the USB connection, into a computer and write your own open source tools with it, which is really exciting.
Blackberry or Smartphone?
Neither, get a life! Talk to somebody. If you’re sat on the train try talking to the person sitting next to you. I once had an Android for two weeks that I borrowed and when I checked my Twitter on the toilet I realised I really shouldn’t have a Smartphone!
Geek or nerd?
Geek, but chic.
Do you think there’s a difference between a geek and a nerd?
Yes, there is. There’s a much better culture associated with geeks than there is with nerddom, which is more focused, I think. I think geeks are generally aware of their geekiness, whereas nerds have got a few too many personal issues!
If you were going to give a piece of advice to someone who was thinking about getting into IT, what would it be?
Just try it, you’ll get addicted. Programming and stuff is really addictive - there’s a lot of instant gratification. Don’t be afraid to just try it.