Measuring the performance of software test analysts is one of the trickier areas for QA managers. I have some ideas for setting KPIs to monitor performance. This requires broader analysis and preparation.
Pre-requisites
- QA plan – part of which is to get individual estimates and set the milestones
- Clear task allocation to individuals
- Individual test outcome review
- Weekly individual metrics
A requirements traceability matrix (RTM) can be an invaluable resource in this process. We are busy doing the actual work, so routine management tasks need to be kept to a minimum. The RTM can be maintained in a test management tool or a spreadsheet; a minimal sketch is shown below.
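To keep that overhead low, even a plain CSV export is enough. The snippet below is only a rough sketch of the idea, not a prescribed format – the file name, column names, and status values are my assumptions – and it simply reports how many test cases cover each requirement and how many of them currently pass.

```python
import csv
from collections import defaultdict

# Assumed RTM layout (illustrative only):
# requirement_id, test_case_id, status   where status is Pass / Fail / Not Run
def rtm_coverage(path):
    """Summarise, per requirement, how many linked test cases exist and pass."""
    coverage = defaultdict(lambda: {"total": 0, "passed": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            req = row["requirement_id"]
            coverage[req]["total"] += 1
            if row["status"].strip().lower() == "pass":
                coverage[req]["passed"] += 1
    return dict(coverage)

if __name__ == "__main__":
    # "rtm.csv" is a hypothetical export from the test management tool or spreadsheet.
    for req, stats in rtm_coverage("rtm.csv").items():
        print(f"{req}: {stats['passed']}/{stats['total']} test cases passing")
```

A weekly run of something like this gives the individual metrics mentioned above without anyone filling in extra status reports.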
Key Performance Indicators (KPIs)
I have listed five KPI areas which I think are the most important, along with the weightage of each; a short example of rolling the weights up into an overall score follows the list.
Quality Driven (60%)
- Static testing feedback
- Test script design for functional requirements and improvements
- Number of executed test cases and Jira issues raised
- Attention to detail
- Quality of delivered outcomes – deliverables are clear, concise, comprehensive, etc.
- Effective test data
- Defects found after general release
- Contribution to process improvement
- Risk analysis
Teamwork (10%)
- Work within the QA team
- Work across teams, i.e. with PS and the Product team
Project management (15%)
- Prioritizing tasks
- Efficiently managing own work
- Timely delivery
Technical know-how (5%)
- Functional expertise
- Technologies
- Test tools
- Test automation
Personal attributes (10%)
- Results driven
- Provide constructive feedback and accept criticism
- Effective communication
- Proactive
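To illustrate the arithmetic, the weights above (60/10/15/5/10) can be applied to a per-area rating, for example on a 1–5 scale. The sketch below is hypothetical – the area keys, the rating scale, and the sample ratings are my assumptions, not part of any tool – and just shows how the weights roll up into a single overall score.

```python
# Weights taken from the KPI list above; ratings are assumed to be on a 1-5 scale.
WEIGHTS = {
    "quality_driven": 0.60,
    "teamwork": 0.10,
    "project_management": 0.15,
    "technical_know_how": 0.05,
    "personal_attributes": 0.10,
}

def overall_score(ratings):
    """Weighted average of per-area ratings; keys must match WEIGHTS."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights should total 100%
    return sum(WEIGHTS[area] * ratings[area] for area in WEIGHTS)

# Hypothetical ratings for one analyst in a review period.
example = {
    "quality_driven": 4,
    "teamwork": 5,
    "project_management": 3,
    "technical_know_how": 4,
    "personal_attributes": 4,
}
print(round(overall_score(example), 2))  # 0.6*4 + 0.1*5 + 0.15*3 + 0.05*4 + 0.1*4 = 3.95
```

With these sample ratings the analyst lands at 3.95 out of 5, driven mostly by the 60% quality-driven weighting, which is the intent of the scheme.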
Many of the KPIs listed above are challenging to quantify. The defect tracker, the RTM, and the project plan will be the key sources of evidence going forward in this context.
I am not in favour of keeping the number of defects raised as a KPI. That leads to a situation where the quality of defects degrades while the count shoots up.