Training needs analysis (TNA) is the process of systematically identifying the gap between the capabilities an organisation has and the capabilities it needs to achieve its strategic goals, then using that analysis to prioritise learning and development investment. It is one of the most important activities in L&D and one of the most frequently skipped or done superficially.
The consequences of poor TNA are visible in most organisations: generic programmes that do not address real capability gaps, learning that participants find irrelevant to their actual work, and L&D budgets that are difficult to justify because the link to business outcomes is unclear.
The Three Levels of Training Needs Analysis
TNA is most useful when it operates at three levels simultaneously.
Organisational level analysis identifies the strategic capability requirements driven by the organisation's goals and direction. If the organisation is moving into new markets, shifting its delivery model, or navigating significant technological change, those priorities should drive the top-level L&D agenda. This requires L&D professionals to understand the business strategy well enough to translate it into capability terms.
Job or role level analysis examines what specific knowledge, skills, and behaviours are required for effective performance in key roles, and how well the current population meets those requirements. Role profiles, competency frameworks, performance data, and structured conversations with senior stakeholders all contribute to this picture.
Individual level analysis identifies specific development needs for individuals, typically drawing on performance reviews, 360-degree feedback, self-assessment, and manager input. This level of analysis drives individual development plans rather than programme design.
Methods for Gathering TNA Data
Effective TNA uses a combination of data sources rather than relying on any single method.
Surveys and questionnaires are efficient for gathering input from large populations. They work best when questions are specific and behavioural rather than asking generic questions about whether people feel they need development.
Interviews and focus groups provide richer contextual data and are particularly valuable for understanding the business context behind capability needs. Speaking directly with senior leaders, line managers, and high performers in critical roles surfaces nuance that surveys miss.
Performance data, customer feedback, error rates, and operational metrics provide objective evidence of where capability gaps are affecting outcomes. This data is frequently available but underused by L&D teams.
Skills gap assessments and diagnostic tools provide structured capability baselines. Our [L&D Programme Diagnostic](/tools/programme-diagnostic) helps L&D teams assess programme design readiness and identify where their current approach may be leaving performance on the table.
Prioritising and Translating Analysis into Action
Not all capability gaps are equally important. Prioritisation should be based on two factors: the strategic importance of the capability gap, and the degree to which learning intervention can close it.
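The two-factor prioritisation can be sketched as a simple scoring model. This is an illustrative sketch, not a prescribed method: the 1-5 scales, the multiplicative scoring rule, and the example gaps are all assumptions introduced here for demonstration.

```python
from dataclasses import dataclass

@dataclass
class CapabilityGap:
    name: str
    strategic_importance: int   # 1 (low) to 5 (high) -- assumed scale
    learning_closability: int   # 1 (hard to close via learning) to 5 (easy)

def priority_score(gap: CapabilityGap) -> int:
    # Multiplicative score: a gap must rate well on BOTH factors
    # to reach the top of the list; a gap learning cannot close
    # scores low no matter how strategically important it is.
    return gap.strategic_importance * gap.learning_closability

# Hypothetical gaps for illustration only
gaps = [
    CapabilityGap("Coaching skills", 4, 4),
    CapabilityGap("New-market product knowledge", 5, 5),
    CapabilityGap("Role clarity in middle management", 5, 2),
]

for gap in sorted(gaps, key=priority_score, reverse=True):
    print(f"{priority_score(gap):>2}  {gap.name}")
```

Note how the third gap, despite its high strategic importance, falls to the bottom of the list: the model flags it as a candidate for a non-learning intervention, which is exactly the point made below about role redesign and structural change.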
Some gaps are closed more effectively by interventions other than training. A management practice gap driven by unclear role expectations and poor performance management processes is not primarily a training problem. Role redesign, structural change, or selection decisions may be more effective than a management skills workshop.
Where training is the right intervention, the TNA findings should drive decisions about format, audience, content prioritisation, and measurement approach. The link between the learning design and the identified business need should be traceable and explicit.
A well-executed TNA also creates the baseline against which the impact of development can be measured post-programme. If you identify that 40% of managers lack effective coaching skills before a programme, measuring again twelve months later gives you evidence of the capability shift the investment produced.
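The baseline-and-re-measure arithmetic is trivial but worth making explicit. The sketch below assumes the 40% figure from the example above and a hypothetical follow-up figure; the function name and numbers are illustrative, not from any standard instrument.

```python
def capability_shift(baseline_meeting_pct: float, followup_meeting_pct: float) -> float:
    """Percentage-point change in the share of managers meeting the
    capability standard between the baseline TNA and the post-programme
    re-measure."""
    return followup_meeting_pct - baseline_meeting_pct

# Baseline TNA: 40% of managers lack effective coaching skills,
# so 60% meet the standard at baseline.
baseline_meeting = 60.0
# Hypothetical twelve-month re-measure: 78% now meet the standard.
followup_meeting = 78.0

print(f"Capability shift: "
      f"{capability_shift(baseline_meeting, followup_meeting):+.0f} percentage points")
```

Reporting the shift in percentage points against the TNA baseline keeps the claim honest: it describes the observed capability change, while attribution to the programme still needs the usual caveats about other factors at play over the twelve months.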