This week we are joined by Brian Murphy, Senior Director Employee Skilling at Microsoft, and Charles Jennings, Managing Director at Duntroon Consultants. Summarising a recent podcast about AI and the future of learning, this blog lays out the key takeaways from their discussion, giving examples of where there are great existing – and future – use cases. It also explores limitations and where organisations can fall prey to over-reliance.
What’s the history between Microsoft and OpenAI?
Microsoft has been working with OpenAI since 2019. There has been a lot of collaboration, particularly around this new era of AI, with Microsoft and OpenAI developing large-scale supercomputers to train these models. That work has progressed to the point where we will likely see applications like ChatGPT across the Microsoft product suite.
AI is not new. It has been around for a while, and Microsoft has been using it for just as long. However, it has mostly been used to power search and social media. It’s been on autopilot so far, but now we’re moving into a co-pilot era.
What role will AI play for learning and development?
We are now seeing AI positioned as a key productivity tool. Humans learn mostly through experience and through connections. This new era of AI simply helps us continue to accelerate that way of learning. From an L&D perspective there’s going to be a lot of people saying this will transform what they do. In reality, we probably should have changed our practices even before the advent of AI.
We are still going to need professionals such as solicitors, because human checks are needed and that expertise must be laid over the top of these technologies. For the majority of us, AI will change what we do and how we do it. However, we’ve got to be mindful of its limitations.
In the next few years, we’re going to see some cases where human expertise is not overlaid onto AI outputs, creating a variety of potential problems – including accountability. Who is responsible if something goes wrong? We can’t have a technology holding accountability.
Do you need a certain level of expertise to use ChatGPT effectively?
Using AI technologies such as ChatGPT effectively is like the difference between a new graduate and someone who has been working in the field for years. For example, what is the difference between studying physics and being a physicist?
When you graduate, you have that knowledge embedded in your head, but it doesn’t make you a professional. What makes you a professional is being inculcated into the culture and understanding the nuances of the field.
Experts can look at data and identify that it doesn’t quite look right, even though they may not be able to put a finger on why. They’ve seen 10,000 data sets before and haven’t come across one that looked like this. A certain level of familiarity with the topic is necessary to know that the information presented may not be entirely accurate.
What can humans bring to the table that AI can’t?
AI is going to help with innovation, but whether it will replace human invention is a separate issue, because innovation and invention are two different things.
The human mind is capable of inventing something entirely new as well as building on what has come before. AI may allow us to spend more of our time being creative. Creativity is an innately human trait, because to be truly innovative you need to be curious, which AI can’t be.
What skills or capabilities are important for us to be successful coexisting with AI?
One capability is learning how to learn, meaning acting with curiosity. Another is adapting to change and agile sense-making. A third is collaboration and cooperation: in this context, collaboration means working together towards a common goal, while cooperation means sharing knowledge with the purpose of making the collective more intelligent. These are all innately human capabilities.
Humans learn in several ways: through rich and challenging experiences, through opportunities to practise and reinforce what we do, and through connections with others. AI can help by providing many of the supporting resources.
How can AI assist HR?
HR is focused on the individual: job roles, job role matrices, and skills matrices and frameworks built around the individual. However, we exist in a world where very few people achieve their outcomes alone. We work together, and AI can help us as team members.
AI can do a lot of the work that automation has been doing for years, only faster. This lets people hand the more mundane tasks to AI and focus on higher cognitive-level work. That said, we now need to step back and think about which capabilities we actually need.
What is a skills-based organisation and how can AI support its development?
A skills-based organisation is one which places a strong emphasis on the skills and abilities of its employees, rather than their job titles or positions. It is a complex but increasingly popular concept, as discussed in our latest talent forum, and AI is well placed to support the shift.
There are three fundamental questions:
- What skills do we currently have? AI can help us get much closer to understanding the skills and capabilities in the organisation, at the team and individual level. It will also support navigation and open doors to career destinations.
- What skills do we need? Some talent marketplaces are already using AI to support skills-based career development, recommendations and coaching.
- How do we bridge the gap? Bridging skills gaps, particularly the move to mastery, depends on experiential and connected learning, and talent marketplaces are using skills data and AI to personalise recommendations.
On the talent and learning side, the value of skills intelligence will only be realised if we think about how work is actually done in organisations. There’s no point building the infrastructure if the business is still trapped in a backward-looking approach to work.