Chapters
00:00 Introduction
01:04 Managing Expectations: Aligning AI with Problem-Solving
09:22 The power of clear communication and employee involvement
14:09 Recognising and Balancing Different Roles in Managing Expectations
17:28 Harnessing Diversity of Thinking as a Superpower
21:07 Teamwork and Collaboration for Effective Expectation Management
23:04 Clarifying Expectations: Asking About Assumptions and Success
24:30 Starting Small: The Value of Proof of Concepts
26:24 Conclusion and Further Resources
Welcome everyone to the Humans and AI in the Workplace podcast. Over the last few years, it's become clear that artificial intelligence, AI, is one of the most impactful and disruptive transformations in the workplace. As a leader, you may be wondering how to get started and how to do it in an intelligent way. Or you may be stuck on how to overcome some of the people issues and human bottlenecks your AI has crashed into. We are here with Dr. Debra Panipucci and Leisa Hart to discuss today's topic of managing expectations around AI.
So let's talk about managing expectations. With all the hype around the productivity gains that AI and intelligent technologies can bring, there are more and more stories about the inevitable letdown once reality sets in. In this episode, we'd like to share some practical actions leaders can take to manage expectations, both their own and their people's. Mostly we speak with leaders who are kind of bullish about AI's almost guaranteed delivery of productivity gains, and that confidence isn't always grounded in a deep knowledge of the solution, or of the problem they're trying to solve, or of whether this is the right technology fit.
And there's a special kind of hype associated with AI, and new highs in expectations. Most reports talk about this largely in terms of job losses due to the capability and efficiency gains of intelligent technologies, and the natural productivity gains that will come with that. What's interesting is a recent survey released by the Upwork Research Institute, which surveyed two and a half thousand people across the US, UK, Australia and Canada: 96% of C-suite executives say they expect the use of AI tools to increase their company's overall productivity levels. But only 26% of them have AI training programs in place for their people, and only 13% report a well-implemented AI strategy. So there's a bit in there to unpack for us in this episode today. And we'd like to share with our audience, and the leaders listening, some of the practical ways they can start to manage expectations and get better outcomes as a result. Often the starting point we go to is exploring whether there are leaders in the company who are operating independently, in the sense that they all have their own agendas and their own business areas that they run quite separately, and whether they are on the same page in terms of the need for this and how it would look for them. Because often when AI or intelligent technology is brought into a company, it's brought in by an expert, a chief technology officer or a data scientist, or it's already in the software packages the company is using. So there's not a common understanding among the leaders of: this is where we need to get to as an organization.
My team and my business area are part of this; the role that we play, and how this will help us as an area, is X. And if you're already misaligned at that beginning part, it's really hard to have clear expectations in terms of what it's going to deliver, and when, and how, and how long it's going to take. So if you haven't surfaced that and brought it onto the table to get aligned perspectives, aligned expectations, at the very start, then you're going to be challenged with disappointment, or frustration, later on. So how do you make it coherent for the business? What problems are you trying to solve? What use cases do you have, and are they the right ones to be solving? Have a really clear strategy for that roadmap, that pathway to innovate: generate some ideas, do some ideation, maybe run some small proof of concepts and experiments. You'll understand more about how people are engaging with it too, if you do that in a thoughtful way and are realistic about involving them, so that they can help you confirm this is the right problem to be solving. And then to start to look at, well, if you've really landed on that as the right place to spend your time and effort, and the right problems to be solving:
Well then, what are the solutions from a technical perspective? And don't just go, "we want this AI, now what problems can we use the AI to solve?" You need to kind of flip it around. Your people play an important role in helping you understand where those opportunities are and which problems need solving, because that's their day-to-day work. They'll see the things that don't work in their processes, and the things happening from a customer experience point of view that they've got visibility of. There are so many aspects of this that your team could be informing, if you're involving them early and in really thoughtful ways throughout the process. Once you've got the problem, and you think there are some technologies that could help solve it, that's where the critical thinking component comes into play. Deb, so thinking critically about: well, again, what is the technology? Can it practically help solve this issue?
That's right Leisa, and another point I'd add in relation to experiments and bringing AI in: often we see AI come in as a pilot, or as a technology that people are allowed to experiment with and play around with as part of their work, looking at ways they can improve how they work. But to get the gains and the benefits out of it, you have to be realistic about your culture. If you don't have a culture that is open to making mistakes, or to using technology that may fail, then you're investing a whole heap of money in technology with no guarantee that it's actually going to improve anything, because it's the way it's used, and how it's used, that actually delivers the benefit you're trying to get, and that involves people. So without creating the cultural environment for experiments and trying new things, where people can put up their hand and say "this isn't working for me, let's try something else", you're going to have people ploughing on with technology or processes that aren't actually working for them, because they don't want to put up their hand and say so. Or you might have people who are too afraid to step into that space and try the technology out, because they're worried it won't be successful, so instead they sit on the edges, watch from a distance, and don't really participate.
The other thing to add to that is that people actually want change. They want better work. They want an opportunity to increase their skills, to evolve their skills over time. There was a recent study by PwC on workforce hopes and fears, and one of the questions asked people to think about the changes they've experienced in their role and to what extent they had positive, neutral or negative feelings about those changes. 72% of people had positive feelings towards the statement "I'm excited about opportunities to learn and grow in my role". My role as a leader, therefore, is to be thinking about that in the mix of managing expectations. You've got people who are super keen to continue to evolve their skills and what they're contributing to the business. And at the same time, you've got potentially other leaders excited about what the technology can deliver in terms of productivity. So how do you marry those two together and bring them to a middle ground, where there is a reasonable expectation from everyone about what the technology can and can't do?
We've worked in businesses a long time, and I've never had anybody say no when asked if their systems and processes could be improved. They always say yes. There's always something that can be changed for the better, to make things easier for them. Everybody always says yes. They want the less exciting, more mundane parts of their job to be delivered in some other way, so their headspace is freed from things that could be handled through automation, a co-pilot function, or some piece of intelligent technology. So I don't think there's necessarily what's typically referred to as resistance. The number one thing that motivates people is progress. And that means progress in the work too, not just outside of work: progressing that piece of work, or the way that you do something.
You know, if you're doing the same thing the same way you did one year ago, two years ago, five years ago, ten years ago, God forbid, well, we've got that natural desire for evolution and improvement as human beings. So it's up to leaders to make sure you're harnessing that in the right way, for the right thing, at the right time, and managing expectations because you've been thoughtful about what you're trying to solve and whether this is the right technology to solve it. The challenge for leaders is that employees do want improvement in their processes and systems, and when they look at the market and the media coverage of AI and intelligent technology, it feels like they could go from zero to 100 by putting it in place. It's one of those new areas where not a lot is known just yet, because it isn't heavily embedded in businesses, and so it feels like a silver bullet: yes, this technology can come in and solve all the problems I've always had. And we know that's never the case. It's the baby steps, isn't it? As for the media hype: how do you navigate through it without watering down people's expectations and hopes, while putting a realistic lens on what can be achieved with the current state of your business, the current culture, processes and skills, and what you hope to get to as a future vision? Because what you might be hearing in the media is something like a version of Copilot where some people are saying they get a 70% increase in productivity, which is epic, right? But that's not always going to be the case in every organisation, because we know how much culture and leadership play a role in individuals' willingness to experiment with something in the first place, and their ability to make sense of it depends on the environment they're in.
So you can't just look at something that's broadcasting a message and assume you'll get that same amount of productivity in your team, in your business, because there are things that will get in the way depending on what cultural attributes and dynamics you have in your business.
Are there bottlenecks, or things that will disrupt people's ability to lean into that technology, that other organisations have already solved for? There is an opportunity to manage your expectations on the basis that, yes, there is an ability to get some uplift and advantage from intelligent technologies, and let's see what that looks like in our organisation, in our environment, in the way we've structured our technology, in the way we've housed our data. All of those things will be context-dependent. It comes down to this: if I'm a leader looking to bring intelligent technology into the organisation, we have high hopes for what it could do, we've got a vision, and there are a lot of areas that could be improved by bringing it in. But there's a step before you bring it in: look at your AI readiness, and where your bottlenecks and gaps are in terms of data, processes, systems, infrastructure, knowledge and skills, so you can identify what can realistically be achieved with the intelligent technology right now, versus in one year, two years, three years, and plan it out in a really thoughtful way. Then have clear communication and messaging with the employees who will be interacting with this technology, so they can have clear expectations, but also the vision of the future, because we all need to know where we're going and what good looks like, and there's always a path to get there. In terms of expectations, there are different roles that people will play when you start looking at bringing intelligent technology into your organisation. There will be the people who lean more towards a risk mindset; they'll be thinking about all of the risks involved and putting all the possible negative outcomes on the table: the different scenarios of what could go wrong, things that need to be managed or planned for, workarounds that need to be in place. And they'll be really important voices too.
But you need to recognise them, right? You need to have them in the room and hear them, but you also need to make sure they're balanced out by the positive, enabling voices: this is how we do it, this is how we solve those problems, this is how we mitigate those risks, these are some of the things we could put in place, these are some of the steps we can take to learn and gather information so we can assess whether the assumptions are valid or not. And there are the bigger implications too. Thinking about those enabling voices, what are they looking at from a broader perspective? Okay, so we have the guardrails in place, but are we playing the long game? Have we got the right skill set, or the right skills pathway, in our businesses? How do we help people get the skills they need? So you're looking at what's happening now, but you're also looking to the future from that foundational capability-building piece, or that cultural impact piece. And then you've got the super curious, creative people who love to experiment and innovate; they'll get super excited about something they heard on a podcast this morning or read in the news, and they want to bring it in straight away. So you've kind of got to balance all of that out in terms of the different roles people will play, and will naturally want to play because that's what they like, and that's why they need balancing. And it's the different frames they're looking through.
And people's expectations tend to be aligned to those frames. So it's being clear on where people are, and how we meet them where they're at and help them get on the same page, which really comes down to good prioritising, right? And bringing out all of those assumptions as part of your decision-making process, and getting the narrative together: why are we doing this? What are the key risks we're trying to manage?
What are the risks and issues that are there but aren't a priority for us, that we're okay to live with? It's all part of managing people's expectations and setting that realistic expectation of what GenAI products like Copilot, other intelligent technologies, and even automation and RPA can feasibly do for your business. It really comes down to having those people, those voices, aligned and on the same page, so that everybody has the same expectations, or at least realistic expectations. And that's a really simple thing you can do at any point, but preferably at the start: get the right people in the room and agree on those different lenses and different expectations. Then manage that together and keep revisiting the conversation: has the risk profile changed? Okay, what do we expect now? What is the consensus across those different roles you're expecting people to play? The other thing I'd say about people playing those different roles is to really lean into the power of that diversity of thinking and use it as a superpower. Don't see it as something that's going to slow you down or block progress. If harnessed correctly, and if you create the forum for conversations that are genuinely collective thinking, it's going to be a superpower: you get things raised that others wouldn't necessarily bring to the table, but that could potentially be game changers in terms of mitigating a risk or engaging people in a different way. So I'd really encourage people to think about how you can bring people together as soon as you can, to align those expectations and understand the roles different people are playing, and how it all comes together in the middle to create the picture of why you're doing this. And it's not easy, right?
So one of the first things we learn in university in any psychology-related class is the similarity attraction bias: we like, and like to work with, people who are similar to ourselves, who think the same as us and have the same expectations and assumptions. So getting into a space where people are different and approach things in different ways is uncomfortable, and it's difficult, and it's hard to see things from their perspective. It's hard to see why they have certain expectations, or why they're approaching something differently to the way you're approaching it.
It's not an easy thing to do. No, but it's also a really powerful thing if you can harness it. And if running multi-disciplinary teams, or meetings with different thinking styles, is not a habit in your organisation, it will take some time to build that muscle. As you said, similarity is a comfortable process for the brain, because the brain loves to be efficient. So of course we're going to gravitate towards the people who think and feel the same, or similarly, about things.
So asking people to think differently forces more cognitive effort and takes more resources. But ultimately, it's still going to be more economically beneficial than finding out later that something's not working because those voices weren't in the room and that thinking wasn't considered. So diversity always lends a greater voice. We might not always like the answer, or the provocation.
And sometimes it's hard, depending on the skill of the person raising the concern; they might not have the best way of putting it. But if you can look past that and go, actually, there's something in here for us to think about and grapple with collectively, that's where the real work happens. There are so many examples where this hasn't happened: businesses that failed and are now epic case studies. AI is going to put that pressure on steroids, because it's moving fast and it has the power to disrupt bigger than anything else we've had in business before. So now is the time to sit for a minute and look at how you're managing expectations for the short term and the long term, and what mechanisms you've got in place to check in. Make sure you've got the right voices in the room. Be really clear on the problem you're solving, and on how you come together, continue to check in, and grapple with that progress over time, making sure it's happening in the right way. That's the only way to get the productivity out of intelligent technology, right? Yeah, I mean, gone are the days when you could build something in isolation, throw it over the fence, and hope for the best. That never worked well, but the consequences weren't as great with previous forms of technology. Now the consequences of getting it wrong are significantly bigger.
You've got data breaches, you've got all types of disruption to your culture, the loss of great talent in your business. Do you understand the technology and its potential, even if your team, your business unit, or your business isn't ready to fully adopt it? Do you understand what's available, so that you can start to plan how to bring it in? How do you remove those human bottlenecks and blockers to productivity? How do you get all the people in the room with aligned expectations of what the technology can bring to the business, all on the same page, all ready to be part of it and put their own effort in? The last thing I'd say is: who are your peers, and how are they involved in the challenging, supportive, robust conversations that need to happen at that level? That's how you get the outcome from a collective approach: the leadership level is aligned and moving in the right direction, and leaders are challenging and supporting each other at that level. Because you might think you've got a thoughtful approach and you understand the bottlenecks, but you can't do it alone, and you can't do it in isolation. There will no doubt be things that cut across the business and different teams. So how do you get your leaders aligned? We work a lot with leaders in this space, bringing them together and really understanding the type of teamwork that's required moving forward, because the teamwork we've had in the past won't necessarily serve us in future. Intelligent technology requires executive teams and senior leadership teams to think differently about how they perform together, in a way that supports the future workplace they're creating, curating and leading, to get the outcomes they need. So that's what I'd challenge leaders to think about as they think about managing expectations.
How do they know they're doing that really well? Because they're checking in with their peers. Yeah, and even at its simplest form, I would ask the question: how do you know what someone else's expectations are? Have you asked them? Have you flat out, bluntly asked, "what do you expect to see from this intelligent technology or project we're putting in place?" And really delve a bit deeper into the assumptions. What assumptions do these people have? Even ask them directly: what assumptions are we making as we pull this together, plan it out, and put it in place? Because that's the only way to get clarity around people's expectations, and around the assumptions that may not be true and may need to be tested and validated. A good way to do this is also to check in on what success looks like, because everyone's going to have a different lens potentially.
And that's another great way to check expectations. Yeah. We were supporting a tech implementation recently where they'd already rolled something out and had 10% adoption. When asked what good looked like in their eyes, in terms of what could be achieved, they were talking about 30%, whereas we pushed them up: we think you can actually get to 80% if you do all the things that need to be done and don't leave it up to chance. If you're thinking thoughtfully, planning it out, and putting effort into the right things, then you can get to 80%. But it is about getting clear on those expectations: what does good look like, what does success look like? And it's not usually "we want to get to 100 percent, we want to save 40 million dollars". It's usually: let's start realistic. What can feasibly be done, what are the steps needed to get there, and are we willing to put in the effort for those steps? If not, then you dial it back a bit in terms of what success looks like. But if you can put those steps and that effort in, then you can get to 80% if you want to get to 80%. So in summary, I think it's: get clear on what you're trying to solve, then look at which technologies can help you solve it, and critically analyse that. Understand your own expectations and the expectations of your people, and how you can get them involved in the delivery. I'd also add: start small with a proof of concept, and really interrogate it from a number of perspectives, the technology perspective but also the people perspective. What are your people thinking and feeling through that process? Through a proof of concept you get a really good sense of what this means short term and long term.
So there's some practical things that we can encourage leaders to do starting today when they're thinking about managing expectations of intelligent technologies in their business. There's also more resources on our website for leaders who are a bit more curious and want to go a little bit further and take more action than what we've suggested today. So thanks for listening.
Humans and AI in the Workplace is brought to you by AI Adaptive. Thank you so much for listening today. You can help us continue to supercharge workplaces with AI by subscribing and sharing this podcast and joining us on LinkedIn.