How We Pivoted From SMART to OKR: Part 1

Introduction to the OKR framework:

The OKR framework needs no introduction in the world of performance management systems (PMS). Companies large and small are drawn to the idea of setting ambitious goals, seduced by the fresh perspective it brings to evaluating performance. Having measurable results that can be objectively quantified is ideal. And the concept of associating key results with initiatives is what lets all of us contribute bottom-up while keeping an objective overview.

While the concept is exciting enough to pivot away from a legacy PMS, execution is just as difficult. At PromptCloud and JobsPikr, we introduced OKRs last year as we tried to reorganize the way performance is measured, both individually and across teams. We were also looking to evolve our approach to aligning company objectives with individual objectives, in order to improve the visibility of our organizational goals, and later, to fairly compensate our people for achieving them. None of this was possible with our old system of per-individual SMART objectives, because:

  1. There were too many of them, out of a fear of leaving any responsibility unrecorded.
  2. They were rated on a scale of 1 to 5, which added subjectivity.
  3. It was extremely difficult to align them with business objectives. There were cases where the company's performance was poor, and yet the individual ratings still warranted significant raises.

The OKR framework:

I came across this concept while researching PMS frameworks that bring the entire organization onto the same boat, and I found useful resources on the internet. More specifically, those from Perdoo, which is itself an OKR tool. Perdoo also had a ready-to-use Excel template that caught our eye. We started with something like the one below.

We kept all OKRs public across teams for complete visibility, and to keep an open channel through which any team could question another. We also set thresholds for the minimum desired result on the OKRs to count as a good evaluation cycle.
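To make the threshold idea concrete, here is a minimal sketch of how OKR progress can be scored against a minimum bar. The key results, targets, and the 0.7 threshold are hypothetical illustrations, not our actual numbers or tooling:

```python
def kr_progress(current, start, target):
    """Progress of one key result, clamped to [0, 1]."""
    if target == start:
        return 1.0
    return max(0.0, min(1.0, (current - start) / (target - start)))

def objective_score(key_results):
    """Objective score = average progress across its key results."""
    return sum(kr_progress(c, s, t) for c, s, t in key_results) / len(key_results)

# Minimum desired result for a "good" evaluation cycle (assumed value)
THRESHOLD = 0.7

# Hypothetical key results as (current, start, target) tuples
okr = [
    (80, 0, 100),   # e.g. 80 of 100 planned demo calls booked
    (45, 20, 60),   # e.g. leads grown from 20 to 45, target 60
]

score = objective_score(okr)
print(f"score={score:.2f}, on_track={score >= THRESHOLD}")
```

Averaging key-result progress and comparing it to a fixed bar keeps the grading objective, which was exactly what the 1-to-5 SMART ratings lacked.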

At assessment time, we still had trouble getting teams to report their actual progress. There were also cases where the relatively more agile teams deviated completely from their original plans. The first assessment cycle after adopting OKRs was therefore not as smooth as we had expected.

In retrospect, the reasons were:

  1. As was evident, these OKRs were fairly top-down, which reduced the individual teams' engagement.
  2. The cadences of the company OKRs and the team OKRs did not match, so they were never fully aligned.
  3. Finally, as with any other change, the idea was not easy for everyone to adopt. Most function heads had to be pushed to treat OKRs as the source of truth for their plans.

It was clear that we needed a bottom-up approach alongside the top-down one to get a fully engaged workforce. We also started defining company OKRs more frequently, to match the teams' pace. Here is what it looked like for us from the 2nd OKR cycle, which took twice as long to freeze.

The 2nd cycle was noticeably better. People were more used to the idea and understood that it was the one thing tying together company performance, team dynamics, and individual performance. OKR became a term frequently used in internal conversations. This OKR cycle also set very ambitious goals for all teams. Culturally we were on track with the plan, and we were confident of hitting our numbers as COVID approached. Then it hit, throwing us all into survival mode. How we weathered the COVID shock and how we are recovering is a story for another article, but this relatively new OKR concept seems to work well for a small, agile organization like ours.

Stay tuned for the next article in the series: How we use Weekdone to solve our OKR problems.
