At Computer Aid, we have a saying that you can’t evaluate what you can’t measure. We’ve developed a whole series of measurements for our onshore teams that we regularly report to customers. But what metrics should be reported when there is an offshore team involved?
In his article “Tips, Tricks and Traps of IT Offshore Outsourcing,” Nick Krym provides lists of both application support and application development metrics. We will focus only on the application support metrics, because that is the focus of our blog. But if you’re interested, we encourage you to review the article for his list of application development metrics.
He recommends the following measures be reported to customer management monthly with associated reasons:
- Average turnaround time for issues by priority: This metric ensures service level agreement (SLA) compliance.
- End-to-end response time for issues by priority: This metric ensures SLA compliance.
- Resource utilization: Helps both the service provider and the customer understand how resources are allocated and how effective the team is.
- Defect removal efficiency: This generic QA metric helps evaluate the quality of the deliverables.
- Effort to Actuals variance: This metric evaluates the estimating process and ensures SLA compliance.
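To make a couple of these concrete, here is a minimal sketch of how the last two metrics might be calculated. The formulas below are common industry definitions, not Krym’s or Computer Aid’s specific implementations, and the function names and sample numbers are purely illustrative:

```python
def defect_removal_efficiency(defects_found_before_release: int,
                              defects_found_after_release: int) -> float:
    """Fraction of total defects caught before the deliverable shipped."""
    total = defects_found_before_release + defects_found_after_release
    return defects_found_before_release / total if total else 1.0

def effort_variance(estimated_hours: float, actual_hours: float) -> float:
    """Signed variance of actual effort against the estimate, as a fraction."""
    return (actual_hours - estimated_hours) / estimated_hours

# Example: 45 defects caught in QA, 5 escaped to production;
# 100 hours estimated, 120 hours actually spent.
print(f"DRE: {defect_removal_efficiency(45, 5):.0%}")        # 90%
print(f"Effort variance: {effort_variance(100, 120):+.0%}")  # +20%
```

Reported monthly, trends in numbers like these tell a customer far more than any single month’s snapshot.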
We think this list oversimplifies the picture; there is much more on which an offshore team can be measured. For our offshore outsourcing application support projects at Computer Aid, we use our Tracer tool, just as we do for onshore engagements, to track and measure our progress. Although it is customizable, it typically tracks nearly 30 service level goals and provides nearly 60 different metrics reports.
So what do you think? Should offshore teams be held to the same metrics as onshore teams? What metrics should offshore teams be evaluated against? Share your thoughts with us.