
Assessment and quality in trade and journeyman examinations

  • English summary of Fafo-rapport 2023:27
  • Tove Mogstad Aspøy, Asgeir Skålholt, Grete Haaland, Christina Drange and Lillian Gran
  • 17 November 2023

Trade and journeyman examinations are an established form of examination with historical traditions. Passing these exams leads to a trade certificate or journeyman’s certificate, which in turn gives the holder access to work in specific occupations or lines of work. Traditionally, the social partners, through the examination boards, have had considerable influence on this process. However, the formal framework has now changed. Trade and journeyman examinations are a form of assessment in upper secondary education, regulated by the Education Act and the associated regulations, and formally it is the county authority that conducts these exams. Examination boards are nevertheless still responsible for the final assessment, and what matters most for board members is their formal qualifications and work experience. The formal regulations leave room for discretion, and it is therefore interesting to examine how the examination boards interpret and apply this framework. What happens when the assessment traditions of industry meet the regulatory framework of the education system? How and to what extent established traditions and professional norms impact on the execution of trade and journeyman examinations is an empirical question. In this report, we address three issues: i) How is candidates’ competence assessed in trade and journeyman examinations? ii) What factors impact on the quality of the assessment of candidates’ competence? iii) What role do the curricula play in trade and journeyman examinations?

The report is based on a rich dataset: register data, focus group interviews and a survey of examination board members, interviews with the county administration, and observations of exams with follow-up interviews with the examination boards. For the focus group interviews, we selected eight fields: hairdressing, cookery, sales, child care and youth work, aquaculture, industrial machinery mechanics, electrical installation and maintenance, and carpentry.

Examination boards have a large, broad-ranging remit. Around 30,000 people sit trade and journeyman examinations every year. In total, we found 8,000 examination board members registered with county authorities throughout Norway. These members represent different traditions within their disciplines, with their own ways of conducting and assessing trade and journeyman examinations. In some fields, the exam is based more on written tasks than on practical tests. In others, it is closely linked to authentic work processes within the field, with testing carried out as part of ordinary production. In some exams, it is primarily the technical skills required in the discipline that are tested, often at a test station. The substantial variation in how the exams are conducted can be partly explained by different understandings of professional standards, but also by practical considerations on the part of the examination boards in terms of exam duration, travel distance and employer facilitation. The examination boards also have to contend with the diversity of training establishments, both in the nature of their business and in their quality. The degree of adaptation to the industry varies between disciplines and also between examination boards. Furthermore, different examination boards have different criteria or standards for determining whether a candidate has passed their trade or journeyman examination. The examination boards themselves seek a more uniform assessment practice within their respective fields.

Most candidates pass the trade and journeyman examinations, but there is considerable variation across disciplines

Relatively few candidates fail their trade and journeyman examinations (hereafter referred to as the trade exam). In the period 2018 to 2022, the pass rate ranged from 91 to 94 per cent. This high pass rate may help explain why the exam itself has received little attention in research on dropout, even though there has been a strong research focus on dropout rates elsewhere in vocational education.

Although most candidates pass the trade exam, the pass rate varies with the characteristics of the candidates. In this report, we perform analyses that examine factors such as age, final grade from lower secondary school, trade, region and candidate type. Specifically, we find clear differences in the pass rate based on candidate type and across different education programmes. Candidate type refers to the different pathways to a regular trade exam, which currently include apprenticeship, the work experience scheme, the school-based alternative (for students who are unable to obtain an apprenticeship) and certification via an employer. Twenty-one per cent of those who take a trade exam after completing the school-based alternative fail, compared to eight per cent of apprentices. If we compare the pass rates with studies in health, childhood and youth services, which have an average pass rate of 92 per cent, we find that the pass rate is significantly lower in the design and craft subjects, in restaurant and catering studies and partly in electrical engineering and construction subjects. Technology and industrial production has a slightly lower fail rate in our models. We also observe a certain tendency for Oslo to have a slightly higher fail rate and Innlandet a slightly lower fail rate, even when we control for relevant characteristics. However, regional differences are generally small.

The variations in results may be due to differences in the quality of the training or in assessment practices, including the content of the exams. Analyses of trade exam results alone, however, cannot definitively explain the reasons; it is not possible to determine this from administrative data, and most likely both factors play a part. Analyses of register data thus raise a number of questions: if a county has a higher pass rate in the trade exam even after we have controlled for differences in disciplines, grades, age and so on, is it because that county has a different assessment practice? Or is it an indication that the county systematically works differently and more effectively with vocational education and training, whether in schools, in training establishments, or in both?
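To make the notion of ‘controlling for relevant characteristics’ more concrete, the sketch below shows the kind of model such register data analyses typically involve. It is only an illustration: the file name, variable names and exact specification are hypothetical and are not taken from the report’s actual analyses.

```python
# Minimal sketch of a logistic regression on (hypothetical) register data:
# the probability of failing the trade exam given candidate characteristics.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical register extract: one row per exam attempt, with a binary
# outcome (1 = fail, 0 = pass) and the covariates discussed above.
df = pd.read_csv("trade_exam_register.csv")

# Candidate type, education programme and county enter as categorical
# variables; age and lower secondary grade point average as continuous.
model = smf.logit(
    "failed ~ C(candidate_type) + C(education_programme) + C(county) "
    "+ age + lower_secondary_gpa",
    data=df,
)
result = model.fit()
print(result.summary())  # county effects here are net of the other covariates
```

In a specification like this, a county difference that persists after conditioning on discipline, grades, age and candidate type is precisely the kind of residual variation the questions above point to.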

County authorities appoint examination boards but provide little follow-up

County authorities place an emphasis on both professional background and personal suitability when recruiting members to examination boards. They often find recruitment difficult, and advisers use various channels to find new members. County authority advisers would like to be more involved with the work of examination boards but feel that they do not have the time to spend on this. They also want more meeting points between examination boards and the county administration. Our findings indicate that the county administration is generally ambivalent about the examination boards’ assessment work. On the one hand, they say that they trust the examination boards to use the curriculum as a basis when preparing exams, but they are also unsure whether the assessment practice is fair and want a uniform standard across boards and counties.

Background of examination board members varies between disciplines

The survey shows that most examination board members have a trade certificate in the area they work with on the board. Most of those who do not hold such a certificate have relevant work experience or higher education, or a trade certificate in a related subject. Around eight out of ten are active skilled workers, and a total of nine out of ten are in employment. Five per cent are pensioners, and a few are students or unemployed. The typical examination board member is between 50 and 60 years of age and has served on a board for more than five years. The gender distribution largely reflects that of the respective fields and the labour market, with examination boards in construction being male-dominated and boards in health, childhood and youth services being dominated by women.

We identify three different motives for serving on an examination board: 1) Maintaining expertise in the field, 2) Personal development, and 3) The social aspect of examination board work. Administrative tasks related to examination board work and the employer’s (lack of) facilitation appear to be the most significant barriers. In a typical year, members are involved in around ten trade exams, while 12 per cent are involved in 20 or more. The highest number of exams is observed in health, childhood and youth services, and the lowest is in design and craft subjects. Not surprisingly, the perception that examination board work takes up too much time is related to the number of exams a member is involved in. Members in education programmes with a large number of exams are more likely to feel this way. Most examination board members feel that their employers support their work, but not all. Three out of four find it easy to get time off work for examination board activities, while 17 per cent find this difficult. Some members feel that remuneration for examination board work could be improved.

Large variation in how trade exams are conducted and assessed

The execution of trade exams is similar in some ways and very different in others. Common features include informing the candidate about the assessment criteria and the entire examination board being present at the start and finish of the exam. However, we find several sources of variation, including exam duration, supervision during the exam, where the exam is held (in the training establishment or at a test station), the type of competence being assessed, and how well the exam is aligned with professional practice in the field.

The duration of exams is based on the curriculum for the final year of upper secondary school prior to the curriculum renewal. Overall, the examination boards are satisfied with the duration of exams, with some exceptions. In hairdressing, the board wants more time to be allocated to practical work. It is worth noting in this context that, as part of the curriculum renewal, the duration of the trade exam in hairdressing was changed from the previous two days to between eight and ten days. In industrial machinery mechanics, the examination board considers it important to maintain a duration of at least seven days, as was the case before the curriculum renewal, and preferably longer. In principle, this will still be possible after the curriculum renewal, as the new exam length is five to eight working days. In electrical installation and maintenance, it appears to be difficult to keep the exam within the prescribed length if the candidate is required to perform a practical task at a job site rather than at a test station. Notably, neither the old nor the new curriculum sets a maximum limit for the length of the trade exam in this subject, but it must be conducted over a minimum of six working days.[1]

The examination board’s presence during a trade exam varies. In some cases the board is absent altogether, while in others it remains present for the entire duration of the exam. This is partly related to the length of the exam and to travel distance. The examination boards generally express satisfaction with their follow-up practices.

How well exams are aligned with professional practice in the discipline also varies. We observed this in our review of trade exam papers. In sales, the exam paper does not appear to align well with how the occupation is practised, especially in the county that bases the trade exam on written answer papers. The hairdressing exam appears to be well-aligned with practice, although its emphasis on working drawings concerns an element that does not seem to be widely used in the field. Nevertheless, this is an explicit learning objective for the final year of upper secondary school, which the candidate is expected to be tested on. The aquaculture exam appears to be well-aligned with the activities in the training establishment. In electrical installation and maintenance, there is extensive use of test stations, and the trade exam is therefore not well-aligned with authentic work settings, even though the actual tasks performed closely resemble the technical practices of the trade.

Examination board members share insights into several challenges related to the execution of trade exams. Some concern the framework conditions for the board’s work, for example long travel distances. Others relate to conditions at the places where training takes place, for example training establishments that make things difficult for the candidate by expecting them to carry out ordinary work in the period leading up to the exam. This is a topic discussed among board members in several disciplines. Challenges are also linked to the candidates themselves. Sometimes, candidates are not ready to take the exam or have not studied the content to be tested. An example of this is in hairdressing, where candidates often have insufficient experience with the aforementioned working drawings – a competency that is emphasised in the trade exam. However, such individual factors are linked to structural conditions, in that the training has not adequately prepared the candidate or the competence to be tested is not well-aligned with professional practice. Candidates sometimes have diagnoses that place specific demands on the social competence of the examination board. The fact that candidates are different and have varying needs is highlighted as a challenge by some.

As mentioned, there are major differences between the trade exams in the various fields, and we also observed this in the present study. In some exams the emphasis is on technical skills, while others prioritise knowledge tested through a written exam. In two of the four fields we had the opportunity to observe, the examination board did not monitor the candidates’ work: the board was present at the beginning and end, and assessed the quality of the candidates’ professional work and competence based on the results and a technical discussion. The observations also show variation in the terminology used in connection with trade exams. While ‘executing’ means performing practical tasks in three of the fields, it means completing a written exam paper in one of the fields.

In summary

Overall, the data in the report show that validity in assessment quality, in terms of what the candidate is tested on and how the curriculum is used, is related to several factors. It can primarily be linked to the field in which the candidate is being tested. It also relates to the understanding of what constitutes core competence in the field and how this competence can be measured. Some fields are ‘self-supporting’: the examination boards know what being a carpenter entails. In other fields, there is more uncertainty about what constitutes a skilled worker. Combined with an open curriculum, this can lead to greater customisation of the exam to suit the needs of a specific company. Furthermore, the framework that the curriculum sets for the length of trade exams is also significant. This relates to how feasible it is to test the various learning objectives within the number of days specified for the exam in the curriculum.

In sales and office and administrative skills, some examination boards test candidates’ knowledge of the learning objectives but not their competence as a skilled worker. This raises questions about validity – what should trade exams measure? If they are intended to measure actual skilled worker competence, where candidates demonstrate their ability to perform good professional work in practice and provide sound reasons for the professional choices made, it could be questioned whether the exams in sales and in office and administrative skills measure what they should.

The survey shows that in the electrical trades, significantly fewer respondents than average indicate that the trade exam is a good measure of what constitutes a good skilled worker. Conversely, significantly more respondents in health, childhood and youth services and in construction completely agree that the trade exam is a good measure of whether the candidate is a good skilled worker. One explanation for this difference could be the extensive use of test stations in electrical installation and maintenance exams.

The review shows that the various elements of assessment quality can sometimes clash. One example of this is the examination boards’ views on the use of test stations, which highlight the trade-off between manageability on the one hand and validity, authenticity and fairness on the other. Examination board members’ perceptions of the quality of exams conducted at test stations are partly related to the subject they are assessing. However, the interviews suggest that considerations of manageability or practical feasibility can influence examination boards’ understanding of the quality of exams conducted in this way. Some examination boards believe that using test stations enables broader testing than if the exam is held in the training establishment. According to these boards, the training establishment’s activities during the period when the trade exam is to be taken may be too narrow for the candidate to demonstrate sufficient breadth in their competence. In such cases, the examination boards consider the test station to be a good alternative or supplement to an exam in the training establishment. This is particularly evident in carpentry, where the test station can help maintain the validity of the exam to a greater extent than would be possible in the training establishment. In electrical installation and maintenance, the use of the test station is primarily about the manageability of the exam.

Examination board members are calling for better cooperation in assessment across boards and counties, and a more uniform assessment practice. In some fields, the elements that training establishments prioritise in their training vary substantially according to geographical location. How a uniform assessment practice can be reconciled with the sometimes major differences between the activities in recognised trades could therefore become an important issue in the future.

The interviews show that the personal suitability of examination board members is important for maintaining fairness in trade exams. The quality of the training establishment must also be understood as an element of the principle of fair assessment. The quality of the training itself cannot be controlled by the examination board alone; this requires good training establishments and a system for overseeing the training they provide.

[1] https://www.udir.no/kl06/ELE3-02/Hele/Vurdering and https://www.udir.no/lk20/ele03-03/vurderingsordning