Today we continue our discussion of the Nine Lies About Work book series with Lie #6: People Can Reliably Rate Other People.
We cover the subtle difference between feedback given about an individual against an obscure set of criteria, with an arbitrary numerical scoring system, and the professional opinion and experience of a leader describing how they experience and perceive the behaviors and performance of their directs.
We believe the latter approach, recommended by the book, is a great starting point for more effective "performance management" within an organization. Working out an individual, context-rich, and nuanced approach for your team will yield more dividends in the future than infrequent yearly scoring against standardized criteria.
Thanks for joining us today and don't forget to hit the subscribe button or reach out at [email protected].
Charles Knight 0:05
Are you recording?
Robert Greiner 0:05
Yep, just hit record. So we're on episode 39.
Igor Geyfman 0:10
Yeah, apple event?
Robert Greiner 0:11
No. Lie number six?
Charles Knight 0:16
I was gonna say, Apple event? I'm out.
Robert Greiner 0:18
Igor's got Apple on the mind.
Igor Geyfman 0:22
Well, I'm just I'm so excited about that darn iMac.
Robert Greiner 0:26
Did you already order one?
Igor Geyfman 0:27
I can't buy one. I wish I had someone to buy one for, a kid or something like that. I'd definitely be like,
Robert Greiner 0:34
whether you like it or not.
Igor Geyfman 0:38
But I don't have anyone to buy an iMac for. Okay, look,
Robert Greiner 0:43
You're looking at him right here, man.
Igor Geyfman 0:44
This is bad talking about
Charles Knight 0:46
I'm a kid. Yeah, I'm a kid at heart.
Igor Geyfman 0:49
We could solve Charles's computer problems, yeah, by buying him an iMac.
Robert Greiner 0:54
You should do that. And then it would get him into the Apple ecosystem.
Charles Knight 0:57
Yes, it would. Yeah, it really would. That would be the way to do it, because I'm not going to do it willingly.
Robert Greiner 1:03
There you go, Igor.
Igor Geyfman 1:05
Yeah, you're just gonna tell me, like, what color you like.
Charles Knight 1:09
What are my options?
Igor Geyfman 1:10
Well, there's seven different colors now.
Charles Knight 1:12
I wouldn't say that. I don't want to do that.
Robert Greiner 1:14
Just pick a color. What do you want, man? It probably already exists.
Igor Geyfman 1:17
Yeah, well, any color of the rainbow.
Charles Knight 1:19
So when my kids ask me what my favorite color is, I'll give you that answer: it's purple. There is a purple one.
Robert Greiner 1:25
Nice. boom, done. It's on your doorstep.
Igor Geyfman 1:30
I have an iPad problem.
Robert Greiner 1:31
Yes. Yeah, that's an understatement. So just be honest with our listeners: have you already clicked buy now, since you have line of sight on selling your iPads? Don't lie.
Igor Geyfman 1:43
You can't order them yet, all right? I can pre-order. But I have already basically taken it to the maximum possible step. Like, I've configured it and been like, no big deal.
Charles Knight 1:58
I'm sorry, I can't stop laughing.
Robert Greiner 2:00
We appreciate this about you.
Igor Geyfman 2:01
The new screen is just, I think it's too good to pass up, and there's a bunch of performance in it. Most reasonable people would be like, why would you buy another iPad which is basically identical to one you already have? The new 13 looks identical to my old 13,
Robert Greiner 2:19
which is still quite new.
Igor Geyfman 2:20
Maybe the camera bump. So anyway, most reasonable people would just dismiss the thought. But I was ready to hit the buy button.
Robert Greiner 2:27
Lie number one: Igor can survive with less than three iPads.
Igor Geyfman 2:34
Oh, lie number one. But lie number six is today, right?
Robert Greiner 2:38
I'm excited about this one.
Igor Geyfman 2:39
It's a good one.
Robert Greiner 2:40
Yeah. And I found myself agreeing with most of what they said.
Igor Geyfman 2:46
We just came off of review season. So yeah, this is like sacrilege, because all we've done for the last three or four weeks is judge the performance of others.
Robert Greiner 2:55
That's right. But the way we did it, I think is actually quite aligned with what they recommend in the book.
Igor Geyfman 3:02
Yeah, I think you're right.
Charles Knight 3:04
Can we state the lie? Because I'm at a loss here as to which lie we're talking about. I thought it was one thing, but now it might be another.
Robert Greiner 3:10
Yeah, yeah, the lie is people can reliably rate other people. And the corresponding truth is people can reliably rate their own experiences. So it would not be fair, in their argument, to say, Igor, you are a three out of five on your ability to act professionally, or a four out of five on your ability to write business narratives. But what you could say is: as a leader, I interpret, I perceive, I experience Igor as a high performer that I would trust with any deck, and with creating and delivering any narrative to the senior-most people that I work with.
Igor Geyfman 3:51
Or, conversely, I would have a hard time trusting Igor to behave professionally in a high-stakes environment.
Robert Greiner 3:57
Which we talk about a lot: I have people on my team I think of first because I have a familiarity with them, and I have a trust based on us working together. With new people that come on the team, even if they come highly regarded, I have to see it for myself a little bit, and provide feedback, and make sure that there's alignment long term. And when I give feedback around reviews in performance review season, it is structured as "here's what I witnessed, here's what I observed, here's how I interpreted what I saw," which I think is fairly aligned with the book. So I'm not too upset about the way that it was presented.
Charles Knight 4:36
Where does the survey stuff come in? Did they nerd out about that?
Robert Greiner 4:40
Yeah, it's peppered throughout. They get into the distribution of standardized scores, and how people have quirks when they try to rate things. So, Charles, if I give you a one-to-five scale and you give me a one-to-five scale, I may only use three, four, and five, and you may use one through five, right? And they talked about it really in depth.
Charles Knight 5:03
So it's railing against companies that use, like, a number rating scale to figure out who their top performers are and how to allocate bonuses.
Robert Greiner 5:13
Yeah, yes, exactly.
Igor Geyfman 5:15
Especially with, like, forced curves and those. Yeah.
Robert Greiner 5:18
Let me maybe start with a story. This happened to me at my first job out of school, this whole forcing-the-curve thing; I was part of an organization that did that. The leaders of the organization (I worked for a large, global company) determined that 10% of the workforce must be bottom performers, I think they cited Jack Welch or something like that. Basically, if you're in the bottom 10%, you're done, you're out. But only 10% could be top performers. And I think our experience would tell us that there's probably a bell curve, but a hard, fixed 10% rule at any given time doesn't seem to make sense. That's what they said they wanted to do, though. What's worse is they forced those metrics on each team. So you had to have at least one low performer and one high performer on each team. And if you had a small enough team, that was all you could have, right? If you had a team of 20, maybe you could have two top performers. And if you didn't get that coveted exceeds-expectations rating, which you could only hand out one of, you couldn't get promoted for a whole year. So if you had a team of 10 people, and you were really invested in them as a leader, helping them grow their careers, a team filled with really strong, smart people who were, for the organization's purposes, overperforming, you were handcuffed when it came time for career advancement. And not only that, you had to pick a lowest performer who was at risk of losing their job. And so it's one of those, hey, it could be three years before you get promoted. Because this person has been at level long enough, this person just came off of a really good delivery, so get in line, and maybe three years from now it'll be your turn. So needless to say, I left that job.
Charles Knight 6:58
Yeah. Because you were a low performer. Is that the truth?
Robert Greiner 7:01
Yes, forced to leave. That's right.
Charles Knight 7:03
The truth comes out. Igor, I had no idea. No, no.
Robert Greiner 7:07
perennial low performer? That's right.
Igor Geyfman 7:09
Well, other jobs also have, like, slots for promotion, right? So if you're a director and your next promotion is to senior director, in a lot of organizations there has to be a senior director slot that becomes free through somebody leaving, getting promoted, whatever. And there are usually multiple people vying for that slot when it's made available, and it's not always clear when that happens. So yeah, that creates another limitation to the forced curve as well, because it's not even a forced curve with unlimited slots; it's a forced curve with limited promotion availability.
Charles Knight 7:53
Yeah, it's interesting that they chose to attack that mental model, the forced ranking and the limited slots. That's a good thing, I like it. And they're trying to attack that mental model by discounting, I guess, the validity of quantitative ratings. Is that accurate, in terms of what they're saying?
Igor Geyfman 8:17
The way I read it, it was like: your rating is valid, because that's your experience with that person. But that rating is not a valid rating of that person in general.
Robert Greiner 8:28
Yeah, it does not represent the person you're rating, it represents your
Igor Geyfman 8:33
perception, and that perception is contextual, just to you.
Charles Knight 8:37
Yeah, that's the same thing as saying,
whatever you're feeling, whatever your experience is of an event, that's true and valid. It's a subjective experience, right? Are they saying it's the same thing, that we should not equate our experience with absolute truth? Or what's the
Robert Greiner 8:56
Yeah, they're saying it does not represent the person that you're rating; it represents your experience of the person that you're rating, and it should be considered as such.
Charles Knight 9:05
Yeah. So it should not be the end-all reason why somebody is promoted or fired, or...
Robert Greiner 9:11
Yeah, and they're pretty open here and say that's the best we have, we really don't have a good answer here. You still have to rely on the manager to have an honest, solid assessment. And that's really the theme; it keeps coming back to the individual manager, the leader. The role of the leader to their direct reports is so vital, and everything breaks down if that's not effective, right? If you're on a bad team, if you have a bad manager, people rate the company poorly, they don't think the company is headed in a good direction. We've talked about all of those things. It all comes back to building great relationships with your team, helping them grow, providing coaching, feedback, delegation, all the behaviors we talked about before (we've mentioned Manager Tools a bunch of times). Whether it's a team of individual contributors or a team of vice presidents who have multiple layers below them, it all comes back to the effectiveness of that leader with their direct reports, and how good of a job they're doing there makes all the difference in the world.
Do they talk about 360 reviews at all, where you get feedback from peers?
They basically say that it's garbage. If you take a bunch of rough approximations and average them, that works for some things. Like, if you're trying to guess someone's weight and you have 100 people guess the weight of a human being, how much they weigh or how tall they are, and you take the average of everyone's guesses, you're gonna get something really close. Because it's a finite, fixed thing; weight means the same thing to all of us here in the US. If you're trying to convert between kilograms and pounds you may have a problem, but it's roughly a fixed unit of measurement. When you're trying to rate something like business acumen, that can mean so many different things to different people that when you try to take the average, there's too much noise in the signal, and stacking more noise on top of itself doesn't do any good. So they're saying the 360-degree feedback approach, the consolidation of feedback and the averaging of it, is not effective.
I think there's some nuance there that's worth teasing out. Because, on the one hand, at best you can only get an approximation by getting a variety of different viewpoints from people about someone's performance. It's true, you can only approximate. And when you first started saying, oh yeah, 360 just doesn't work, I thought, isn't that what we essentially do? But I think the reason why it works well for us, maybe not perfectly, is because we have such a shared understanding of what business acumen means, for example. We have our expectations framework, which is a defined thing that everybody can see, that everybody has experience with, and there's debate and discussion about what each of those things means. Would you say that, because of that, our 360 approach is good? I'd go ahead and say it is. I feel like it is.
Yeah, I think so, because we have a mentor dynamic here. We're consultants, we do projects that are short-ish in length, right? We don't do a lot of staff augmentation, just sitting on a project for years and years, so there's a lot of dynamics in where people are working and who their direct supervisor or manager is. What we do to mitigate that is we assign people a mentor, and that's their career coach, career advocate. But also, your mentor writes your review twice a year. And it's very formalized, it's very structured; we have a framework with 80-something rows in it. So the review is very much, and we talked about this before, right, the map is not the terrain; it's a directionally correct synthesis of performance over a six-month period across five dimensions that break down into 82 sub-dimensions. And largely that's fine. But the thing is, the mentor, who very rarely is on the project team with the person they're writing the review for, goes through an exercise to collect a bunch of feedback, which is a heavy investment in and of itself, synthesizes and interprets that feedback, and writes their own perspective, their own interpretation, and career recommendations based on what they've heard.
And anchored to the expectations framework that has been objectively written. Yeah.
But we don't say you are a four out of five on business writing. We say, in the last review cycle you wrote a deck for the CIO, and you got really good feedback from the client on the clarity of the messaging and the aesthetics of the deck, and it was passed around the organization, and there was some momentum and buzz around it, and so the thing you created helped influence this broader set of decisions. That's a great data point. Or, hey, you really struggled to get through this deck, and it took you twice as long as it should have, and you had misspellings, and when you went to deliver it you were inconsistent in your delivery. That's a behavior that I observed, and you need some work before I'm ready, as your manager, to trust you in front of client leadership again. But that's still what you, as the review writer, as the mentor, would write down, recommend, suggest for improvement, coach your mentee on, and work with the client manager to make sure it happens. That's all aligned with what the book is saying; it's around your interpretation of that person. You're not saying, hey, you're a three out of five and therefore you're in trouble.
This is where I'm always at a disadvantage, because I haven't read the book. But whether it's a number or a "here's my experience of you," it's still subjective. And I think what makes the mentors good is the whole system. Why I think ours works, it's not just the mentor, it's not just the non-rating. I think the secret sauce is the expectations framework, that we have a very robust shared understanding of. I think that's the game changer for us.
There is that point where they were saying in the book that one thing may mean a range of definitions to a group of people. And I think we do put a tremendous amount of effort into getting very clear and granular about what each thing means.
And then we get a sample size of a lot of people going through the same, like, cohort ranking, so I think that helps. And we don't try to break down C++ really well, or Java really well, or AWS really well; we don't get into that granularity in our expectations framework. It's around problem solving, and you can apply that to an accounting problem or a programming problem or a strategy problem. So I think we have a little bit of a benefit in the way that it's written. But I think you're right, we have a deep shared understanding that doesn't exist most places, and I don't think you can really rely on having people create that, because it takes a lot of work. And I just don't see it out in the wild that much.
Yeah, but okay, now let's think about what advice to give to leaders, managers of teams that don't operate in the nirvana environment that we do, because we've got it made; we've got all the right components in place. I think my advice would be, regardless of what performance management system you have in place, be really clear, through your one-on-ones and all of your interactions with your team, on what your expectations are of them. And you should be talking about what business acumen means, even if there isn't a shared organizational definition like there is for us, which, by the way, we still have to interpret and debate; it's a living thing that has to be reinterpreted over time. I think that's my takeaway: regardless of how you rate somebody and evaluate their performance, the best thing to do is to get really clear on what your expectations are. Like, in this meeting, when you interact with these clients, I expect you to do A, B, and C. And if you don't, that's a bad thing. If you do, that's a good thing, do more of it. Which ties back to our feedback conversation, but obviously feeds into how we rate people's performance too.
But think about how we do it; there's a key message in there. It's a narrative, right? It's a career narrative where we say, okay, in this dimension we're going to write three paragraphs around some examples of what happened over the last review cycle, what you demonstrated, maybe where you should grow. And then you always get two or three career focus areas at the end, which are the most important, the first-among-equals things to go and work on, and represent what's next for you in your career. All of that is around a narrative, a key message that is concise and direct and anchored back to a consistent standard. But it's an interpretation, right? You could have four people at the same level, on the same project, doing the same types of work, with a career development point in the same area that says something different based on where they're at in their careers and what you experienced. The wording could be different, there could be some nuance; one person may be really good at solving the problem but not explaining it, the other person vice versa, but it may all be in the problem-solving space. So I think the importance here is that you focus on those key messages. You're essentially giving someone a map, a career map. You're not quantitatively, retroactively scoring them across a set of criteria that are interpreted wildly differently between leaders. I think that's the anti-pattern, at the end of the day.
And sometimes we'll do that internally too, and that's a trigger for me in some of our reviews, where we'll start using terms that are not quite in the EF. So we'll say, oh, this person doesn't have enough executive presence, or something like that. And that always triggers me to ask, what the hell does that mean?
Igor, you get triggered over this use of terms? Yeah, don't call something a thing if it's not that thing.
I'm very exact with my language, usually. Although I did make up the word lessest at some point.
That was great.
That was a great word.
It just wasn't a real word. But it was precise.
It was a precise word that I made up.
Modern-day William Shakespeare, that's what
I told you. Flattery will get you anywhere.
I really want to try that.
Not the person I'm trying to pull from your project to make my life a success, my professional life successful. I'd just rather have your iPad, please.
That's right. And so it's those terms, it's the labels that we put on people that encompass a bunch of different behaviors and then follow them around. That's the stuff that really gets me triggered, and that's where I'll usually intercede and start asking questions, like, what is it about this person's executive presence that you feel is insufficient? And I think that's also what the book is trying to do: don't label people in broad terms, like as having low leadership, or a lack of emotional intelligence, or whatever.
Yeah. And to give a counter example, we just started in a new area in December, and one of the early people on the team was someone I had not met before. We had this key client meeting, and it went really well; she came really prepared. And I remember after the meeting thinking, okay, I can trust you in front of anyone that we just talked to, without me needing to be there. And I have no clue how other leaders in the past have interpreted this person, have experienced this person. But for me, in the project that I'm in, with the kind of work that we're trying to do, I'm thinking, okay, we're good here. And I have maybe a more clear idea on how to be supportive and where to provide coaching and feedback and those types of things. But it does come from an intuitive place of, okay, I feel good, I feel comfortable with this, and not, you're a two out of seven on this obscure matrix. Not to mention, some of these performance evaluations try to consider potential as well, where you're trying to map historical demonstrated performance, which is time-bound; within the scope of a review period you don't have the time to demonstrate all the things anyway. And the argument in the book, which I agree with, is no one's really watching you that closely to begin with. And second, how can you make a quantitative score of potential for one person as it relates to another when they're in different phases of life, and they want different things, and their career growth isn't necessarily the same? So that part gets a little weird as well, when you start to try to contemplate potential as a layer on top of these weird performance metrics.
And there's a couple of things I'm thinking about. One, which is probably not worth getting into, is the limitations of our language. Our language is English, but this holds true, I think, for any human language. The limitation is that the words themselves are just approximations of the subjective, felt experience of something. We're constantly grasping for words to describe our experience, and in the case of trying to describe our experience with other humans, it's inherently more complex. The other thing, too: have you all heard about, this was probably years ago, this buzz around getting rid of performance reviews? I guess they just don't work that well.
Yeah. And a lot of companies did get rid of their performance reviews.
Do y'all remember the argument for why, and what they did instead? Because I don't know enough about it, unfortunately.
I feel like the biggest argument there was that annual performance reviews are not timely enough to make any sort of difference, and they're meaningless. So a lot of companies invested in equipping their managers for more just-in-time feedback.
Yeah, the typical arguments: it's costly, it's dated and stale, it doesn't include the full picture, it's demotivating, it runs the risk of undervaluing people. And it's hard to contemplate the intangibles. Like, Charles, I've used the word transcendent to describe you before when giving feedback for your review. How do you measure that? You can't; it transcends it.
says the 90th row. model. Yeah, I'm good.
I'm gonna need, you know, my, my Well, you level,
I'm sorry to say, but you actually can't access those hidden rows. I'm sorry. So anyway, there are several arguments against it. Yeah.
But I think this is the book's most honest chapter, in that they're like, we really don't have a good answer. It's just, let's calibrate and call a spade a spade and say, hey, this feedback is my interpretation. Do they talk in the book at all about people rating their own performance? Because, by the way, the fact that they're honest that there's not a lot of great science and research to show what we should do instead, that's great. That also means there's room for experimentation to try to find something better. And I know that there are companies out there doing things that sound really extreme, but to me it's fascinating. Things like, hey, people can set their own salaries, and they can rate themselves, and a lot of these companies use AI underneath to try to surface important information to help people make decisions about what their salaries should be, based off of the responsibilities they take on and things like that. Do they say anything in the book about people evaluating themselves from a performance standpoint?
I think it covers it implicitly. Yes, it really is more from the leader's perspective. But that is a key part too, right? There's two sides to every story; there's multiple sides to every story. And so,
Yeah, 'cause one thing we do is self-assessments. Yeah.
And I'm constantly surprised at how many people don't take those seriously, because that's your input, and it's the only thing you can control: your ability to think about your performance, how you feel you stacked up, and to make a case for yourself. It's also a data point for your mentor, your review writer, to see, like, how detached from reality is Robert being right now? Does he think he's much better than he actually is? There's a self-awareness kind of thing there; it's a data point that can help you assess that as well.
Yeah, that's actually one of the things that I talk to people about who want to know more about what it's like to progress through our manager ranks, because we have three levels of manager at our company. And there are a lot of different dimensions that I talk to people about that are outside of our expectations framework but, I think, are helpful. One of them is: to progress from a level-one manager to level three, there needs to be pretty significant growth in self-awareness, in your ability to accurately self-assess against our standard expectations framework. And a lot of people look at me funny. But if you're still asking, hey, how am I doing, when you're a vice president, when you have the autonomy to go and do things on your own, then something's wrong. You need to develop and hone your ability to see: how am I doing? How did I do in that meeting? I think that's kind of baked into our expectations framework a little bit, but I like to make it explicit and really emphasize that this is a skill that you have to develop. And it's hard, because it requires reflection and confronting some blind spots that we have and some challenges that we might face. But I think that's what we have to do in order to continue to grow at an accelerated pace.
And I will say, I've been here nine and a half years now, and I've had reviews where the narrative, and the conclusion about what I needed to do next in my career, seemed prescient at the time, and I was thinking, oh my goodness, you have just put into words exactly what I feel like I need to be doing next; I completely agree. And I could not have gotten there on my own. I needed someone more experienced, smarter, wiser than me, or a group of people, to have a discussion about my career and come up with that conclusion. There have been times where I felt deeply appreciative of our performance reviews, and other times where I'm thinking, hey, you missed the mark. But it's not what I talked about at the beginning of the podcast, this weird forced-curve, top 10%, bottom 10% nonsense; we go a level deeper than that. And I think we are an instance of what is aligned with what the book recommends. The downside is it's very expensive and very time-consuming, and we put a ton of energy into it. I feel like we get the dividends from it in retaining and growing really great people fast, and in a way that is sustainable and scalable. It just takes a lot of energy, and I don't expect most places to get there, and I don't know that I blame them. It's a heavy burden to bear.
I've had those experiences too, where people say something in my review that just blows my mind. It's, oh my gosh, I can't even describe it. But I know that I have done that for people as well, with just a few data points. Just like you said, you had that one interaction with that new person on your team and immediately felt comfortable that they could go and operate independently and talk to whomever at the client. It's because we spend so much time thinking about this, developing the shared understanding, doing introspection, seeing such a diversity of individuals come through, that we have become masters at this. Tacitly, we're able to very quickly see these data points and draw conclusions that are mostly accurate, though not all the time.
Interesting point. I didn't agree with you when you first said it, but now I think I do. For the last 10 years, every quarter, we go through anywhere between six and 12 of these sessions and get really deep on them, and frequently gather feedback and write reviews and advocate for promotions or not. That's been going on for 10 years, and we have people who've been doing it longer and who are in more review sessions. So yeah, there's a lot of collective wisdom around this internally, which I think is a great benefit. We probably don't talk about that enough.
I don't think we do, yeah. I don't think we talk about how much time it takes and how much of a drag it is on our client work.
Yeah, it really is mastery in progress. There is a danger, though, because the more we know and the more practice we get, the higher the potential and risk there is to jump to the wrong conclusion. Which is why I love that we do this by committee. The mentor, like you said, writes the review, but it's then presented to a diverse committee that acts as a check and balance. That's where that collective wisdom really comes to bear, in helping to ensure that we're not jumping to the wrong conclusions and stuff like that, because we can never get a perfect measurement. These are only approximations. And I love the whole "the map is not the territory" thing, which applies here to people's performance for sure.
We do get it wrong a lot. We think we have all the data and we make a bad decision. Sometimes you're forced to try to predict the future when you're making promotion decisions. In some sense, you're predicting the future, which is impossible for humans to do, so you're making a best guess. The downside of what we do is people tend to have negative anchors stuck to them for too long. They came in as a college hire straight out of the University of Notre Dame, or UT or BYU or A&M or whatever, and then they mess up one time at a client, someone gets a bad perception of them, and three years later someone says, hey, I remember when they did this, what's going on with that? We tend to keep these negative things around maybe a little too long. So we definitely don't do it perfectly, by any stretch of the imagination.
I tell people, if you really want to do this, you have to be vigilant. In as many sessions as I sit in, I have to be on constant guard to protect against exactly that: cognitive biases. They're heuristics that are useful and oftentimes good, but take recency bias, we talk about that a lot. This person just did this thing. Yeah, but don't let that scuttle the body of work over the past six months. So it takes a lot of work, man.
You're so right. But it's still worth it.
It is done imperfectly. But I would still argue that it's a net positive.
Oh, absolutely. Right. And so
I just can't throw the whole thing out because it's got some structural flaws, when there's no better alternative.
Yeah, I don't think there is a better alternative, other than dramatically using technology and data to improve things, which is still going to be inexact, right? More art than science there.
Yeah. And also, if you've ever played fantasy football or something, coming back to sports, when it comes to performance there's so much more feedback, so many more things are measured, and it's a much more finite pool of understood things. Like passer rating, the NFL has a formula for it, and there's a maximum. And you go do your fantasy draft at the beginning of every year, and you see how people performed the last couple of years, and sometimes it just doesn't work. So I think the more data you have certainly can help you. And maybe this also supports the book: I would like that data to be collected about me, so I could take it and interpret it and make adjustments. I don't know if I want that data taken about me and folded into a unilateral career or performance decision that I don't get to be a part of. So maybe that's where I draw the line.
Yeah, yeah, I like it.
So yeah, overall the chapter could have been half as long, 25% as long, take all the nerd speak out, which I would appreciate. It's just not what I'm looking for in this particular book. But it's got some pretty sound advice. What's your conclusion on the chapter, Igor?
This is another one of those chapters I was a big fan of, that really hit me when I first heard it. I think I mentioned a couple episodes ago that I was listening to this book when I was going through a bit of a struggle period, and this chapter really hit, along with the other one as well. And I didn't mind the nerdy parts. If you're addressing a broader audience, some of the statistics and stuff like that is interesting. I'm heavy into customer research, qualitative and quantitative studies, so some of the background on the research was actually really awesome to read. But that's because it aligns with my day job. Interesting stuff.
Oh, that's interesting. So you appreciated it as a
as a practitioner
Research practitioner? Yeah. Yeah. Okay. But for the broader business audience this is speaking to, I can completely see your point. A good editor, I think, would have discouraged them from including it. Certainly, there's
a level of credibility there around the debunking. I just,
It was just too much.
And some folks, the green or blue thinkers, might need that extra detail, while you might just take it.
Yeah, anything else you took away from the chapter, Igor? You said this one hit you hard at a key moment in your career.
So what else,
Just the biggest part for me was: don't assign things to people like a scarlet letter, because it's much more complicated than that, and everybody experiences individuals differently. I might be a three out of five on professionalism for Robert, but a five out of five for Charles when I work with him. And that might be for a lot of different reasons, not just, you know, higher levels of standards and so on. So that was the biggest takeaway for me: you can very reliably express your experience with another human being, but I think it's very unreliable to objectively rate them as a person in any given area, unless there's an objective measure that goes with it, like a quarter mile time or whatever.
Yeah, yeah, totally agree. So on that note, I think we can offer some pretty clear guidance if you're in a leadership position, whether or not you're in one of these old school numerical, quantitative, unclear performance review processes and governance structures. When you're working with your people, think about the key messages. Think about the successes, the opportunities for growth, and what's next for your people. Have you been giving feedback and having conversations around growth in these areas? If anything in the review comes as a surprise, that's not a good sign. And really think about this as a map. It's not the terrain; it's a directionally correct synthesis and narrative around your team's career growth that is tailored to them and will help them get to the next level: double down on strengths, remediate weaknesses or opportunities for growth if needed, and focus at that level. And come from a position of, here's how I experienced you when you did these things, this is what I interpreted, and this is how I think you could continue to do that or get better. I think you'll be in a good spot.
I know I didn't read the book, but I guess the last thing I would say is that performance evaluation should be a dialogue amongst parties, not done unilaterally. Robert, you said unilateral earlier, and I really liked that, because it should be a dialogue between the person being evaluated, the manager, the mentor, an objective third party. The more people you can have a dialogue with about a person's performance against whatever expectations you measure them against, probably the better. Because in my experience, with people who have been rated quantitatively, it's like, where the heck did that number come from? How did they arrive at that? And that's like a code smell, right? That's when you know things have broken down significantly, when it's done in a vacuum and there's no conversation or justification for it. And I would say, I'm so glad that I'm at our company, because how would I explain why I rated somebody a three or a four without something like an expectations framework? I'd have to look them in the eye and say, hey, I'm sorry, I had to pick, because the company policy is there's got to be at least one underperformer and one overperformer. I would feel terrible in that situation.
I'd feel helpless. If you're a leader feeling that way, come talk to us, because maybe you should come work for us. You'll never be put in that situation.
Charles, you're so right about the dialogue. That was really well put; I like that a lot. This is not an asynchronous, unilateral thing that you just go create and throw over the wall, and then that's the truth always and forever in a permanent file. This is an ongoing discussion, just like you would have in any other core relationship in your life. It requires tweaking, right? Not big swerving adjustments; little tweaks over time is the most effective path forward. I love how you put that, that really helped me. Thank you. Igor, any closing thoughts? I saw you went off mute.
Charles, a transcendent five out of five score. Excellent. Yeah, I was like, don't make fun of Charles.
I thought that would have been perfect. You should have done it. We'll count it as potentially funny.
I'm flattered, Igor.
I'll talk to your mentor. Yes. Any closing thoughts, man? We're getting close. We're over the hump, two thirds of the way through the book. We've only got three lies left. I think we've gone through some of the juicier, controversial, hot take ones. I think it's a nice smooth road home from here on out.
Yeah. So just looking at it now: people have potential, work-life balance matters most, and then leadership is a thing. So where
Leadership is a thing?
Yeah. Probably my favorite chapter in the book, and something that was pretty eye opening to me. It talks about the qualities of a leader, here's what it means to be a leader,
leadership potential and
I'm looking forward to coming down the stretch with y'all. I've really enjoyed these discussions; they've been really insightful and helpful. So thanks for taking the time.
Have a good one. That's it for today. Thanks for joining, and don't forget to follow us on Twitter @wannagrabcoffee or drop us a line at [email protected].
Join us for weekly discussions about careers, leadership, and balancing work and life.
A podcast about all of the topics we discuss during our mid-day coffee breaks. We bring you stories, thoughts, and ideas around life as a professional, leadership concepts, and work/life balance. We view career and leadership development as a practice that spans decades and we are excited to go on this journey with you.