The latest BCS CIO networking event, in February 2024, discussed what emerging roles need to be considered in the short and long term and the impact of this on the talent pipeline.
Run under the Chatham House Rule, these events aim to give CIOs an independent space to network and discuss key issues.
The introductory comments set the scene, acknowledging the impact of tech on people and society in general whilst also championing BCS as a place to convene thinking on these issues. This includes the vital focus on professionalism – brought to the fore once again by the heightened coverage of the long-running Post Office scandal, in which the legal presumption that computer evidence is reliable was shown to be flawed, a point the Institute has highlighted.
So, what did our invited panel and guests think about the talent pipeline?
What areas are causing CIOs issues in the talent pipeline?
Understandably, the conversations on the talent pipeline quickly turned to AI. Do CIOs understand the implications of AI skills – or the lack of them? It was suggested that many senior leaders don’t currently seem to have a strategy for AI.
Aptitudes, skills and the nature of work have all changed, so more focus is needed on what actually adds value to a given activity – which makes asking the right questions more important. It was noted that both the public and private sectors are increasingly looking for tech talent with interpersonal skills.
Organisations need talent management and career structures that support the business and retain talent. Forbes recently reported that 70% of employers want creative thinking in 2024, which requires a shift in how HR specifies roles.
What skills are missing? An interesting example was used: when the lift in the office breaks down, nobody needs to explain the business impact, and the same is true when budgets are cut – the consequences are readily understood. Technology, however, changes so fast that many leadership teams don’t keep up, and don’t even necessarily understand their own tech leaders, so the potential impacts are nowhere near as clear. The gap is in IT’s ability to engage with the business about the impact of new technologies.
Success is judged against the pace of change set by companies with strong tech roots. Because the technology is so complicated, technical teams either need to be trusted, or the business needs to invest in helping its technical staff understand the business – and vice versa – to build that trust.
Any shared mission of an organisation needs to recognise the skills that IT brings. Another example: business analysis tries to solve problems the business doesn’t necessarily know it has. Likewise, IT solves problems people don’t understand in the first place. But of course each department knows more about its own skills than IT does, so tech people need to bring staff along with them on the transformational journey rather than being combative about IT’s contribution.
In construction or finance, outcomes are easily understood, but there isn’t a shared understanding of what healthy tech looks like. So, we need to communicate the concepts that help non-technical staff understand the value IT delivers.
In the area of trust, reputation is ever more important. For IT staff, that demands the likes of online badges, memberships, professional registrations and verifiable credentials – IT people need to own their identity and cross-reference their credentials with awarding organisations.
Emerging tech presents a problem. How can IT self-regulate its own currency? What keeps people relevant if there is no registration or chartering requirement? The panel discussed the currency of skills and the role that continuing professional development (CPD) plays in this. The number of emerging technologies, and the speed of their emergence, demand that CPD stays at the forefront of the business.
The age-old issue of CIO representation on the board also came up. One comment drew attention to the fact that a technical person on the board, no matter how highly qualified, is still likely to be asked about a malfunctioning laptop! This attitude demonstrates a lack of understanding of the impact and capacity of the IT team above and beyond basic fixing of IT hardware.
The conversation moved on to generational issues. Recent research was cited showing that Gen Z are less comfortable socialising or being in an office environment and are more independent in thinking and behaviour. The idea was put forward that businesses increasingly need a blend of arts, people skills and technology skills, and that we will lose something if we don’t understand the value of critical thinking and the role of rhetoric. The skills younger people need are empathy and critical evaluation – especially in our world of toxic online discourse (with or without the advent of generative AI!). Remote working combined with Gen Z’s rapid role changing may be a challenge, and businesses need to understand the triggers and motivators of Gen Z to give them fulfilling careers.
Could it be that we need to re-contextualise how we view new technologies such as AI? For example, in the AI sphere we could ask how we coexist with another form of intelligence we may never fully understand. It was suggested that we’re not supposed to understand AI – it is an intelligence – and that this is really no different to our interactions with other people, whom we don’t fully understand either.
Other issues also appear in the application of new tech. It was mentioned that in AI, edge cases are not an excuse – 90% accuracy is not sufficient. It was noted that we need to be clear on the difference between AI and machine learning (ML); a shorthand offered was that AI doesn’t know how it gets to an answer, whereas ML does. This re-contextualising could include learning to trust our AI colleagues – viewing AI as a co-pilot that has certain aptitudes and a certain level of skill, and is hard-working, but cannot always explain how it reached an answer.
Transformation projects also came up as an area of concern, with Harvard Business Review noting that a large percentage of tech transformation projects don’t deliver on their promises – which undermines trust. It also shows these projects should be viewed as value-based outcomes, not just tech transformations. A related point was that a lot of IT spending goes on urgent work, so the underlying things that need to happen – compliance, for example – don’t or can’t happen. Honest conversations about compromises are needed to support the business.
In the longer term another issue arises: if we struggle to recruit and retain early careers talent, how will we develop the senior talent? In larger talent pools, employee comings and goings – and their attendant capabilities – even themselves out. But because AI is a threat to junior roles – and not just in tech – it is by extension a threat to senior roles: if junior roles disappear, so does the pipeline that produces tomorrow’s senior leaders.
How can we solve these challenges?
The currency of skills has become ever more vital - for the business and the individual. So, how can the workforce become more adaptable to these changes when the pace of change is increasing?
Ray Kurzweil’s 2005 dictum on production versus innovation was mentioned: innovations compound with computing power multiplicatively rather than additively – independent innovations multiply each other’s effect.
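As an illustrative sketch of that point (the figures below are hypothetical, chosen for the example rather than drawn from the event or from Kurzweil’s book), the difference between an additive and a multiplicative view of independent improvements g_1, …, g_n can be written as:

\[
\text{additive: } 1 + \sum_{i=1}^{n} g_i \qquad \text{multiplicative: } \prod_{i=1}^{n} (1 + g_i)
\]

For example, three independent 10% improvements compound to roughly 1.1 × 1.1 × 1.1 ≈ 1.33 – a 33% overall gain, rather than the 30% an additive view would suggest.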
A recent development is the pivot to generative AI and data use. This needs charismatic leadership to explain why it is happening – and to bring people along on the journey. A lot of training material is also needed, both mandatory and self-guided. We need a motivated workforce, and here expectancy theory from the 1960s was mentioned: people are motivated by the immediate value to them, so we need to connect the skills the business needs with their relevance to each person’s role.
The culture shift is a challenge—we need people to be responsible for self-development. We need ‘T-shaped’ individuals and at least some people who understand the end-to-end development of tech. Specialisation means that people coming from education often don’t get the context. We can add value by contextualising skills and helping people see their role, the business view, and so on.
Tech staff do need some deep understanding of specifics, but still need to see the larger context. It seems recent additions to the workforce are less able to deal with incidents because they don’t understand the root cause. To retain staff, they need to see they can make a difference - being an effective part of an organisation where they are personally contributing.
Senior level knowledge is vital. It was commented that no senior exec would say they don’t know about people or money, but apparently it’s OK to say this about tech.
To make the most of existing talent, job specs need to cite real-world skills – to attract people with logical skills, not necessarily just tech skills. This includes looking at attitude and finding people who want to grow. Returners are a good potential source, as they already have business acumen and life skills. A future view is also useful – helping existing staff see what their next role might be, with clarity on the next levels and how to reach them.
What are the emerging roles?
With the current impact of AI, two main skills already reflected in the SFIA framework are data science and machine learning, and these are beginning to appear in job roles. At the moment, machine learning is still often used in a scientific context, while data science is coming into large organisations. However, data science is still seen as transactional – a skill you buy in.
There is a groundswell of roles in which staff are using AI – for example to produce code – without their organisation having a high-level strategy in place. The danger is that the sheer scale and complexity of large language models (LLMs) obscures how they produce their output and how to fix problems.
Again, business analysts (BAs) were used as an example. What does a BA need to do to stay relevant? Some BA work can be automated: LLMs can generate models and computer code, Copilot can help draft emails, and user stories can be generated by AI. So BAs need to focus on less repetitive tasks.
To support an AI project, a BA needs to shift towards collecting non-functional requirements, and will need underpinning knowledge of AI principles, cybersecurity, regulation, copyright, plagiarism and the like. To embrace and use this correctly, the BA still needs to sit between business and IT – that role can continue, but BAs can now look beyond the project. They can become strategic business partners, helping organisations look at efficiency, risk management, innovation, even profitability.
Ethical AI demands humans in the loop – human decision-making skills – for example, assessing which AI-generated elements are good and which are useless.
Other roles are not necessarily emerging but evolving. Cybersecurity is still in high demand—AI is not the only gig in town. Data needs to be consolidated, integrated, and structured. Many organisations are still transitioning to the cloud, a process that provides the opportunity to structure data.
In the medium term – the next five to ten years – tech will be a standards setter and provide guide rails to help manage emerging risks. So, for now, we need to celebrate the success of tech and continue driving adoption and engagement, exploiting and shaping innovation, and reducing the time it takes to get innovation into mainstream use.
Get involved
Vibrant, animated and practical, the CIO Network aims to provide leaders with a space to form ideas and ultimately shape tomorrow’s technology agenda.
Want to connect with like-minded CIOs and discuss your challenges and triumphs? Find out how you can join the next event by getting in touch with our team today. Please note the CIO Network is non-commercial.