Simon Ince, Project Engineer, UL

December 12, 2019


The importance of independent certification and competency in fire performance technical assessments

The reliability of engineering assessments of fire performance in the construction industry has come under scrutiny recently, with the Grenfell tragedy and the Bolton student accommodation fire still fresh in the mind. Here, Simon Ince discusses why assessment reports are both necessary and vital to the fire safety industry, and highlights the need for quality, competent justification when carrying them out.

“You can’t test everything!”

This statement might sound like lunacy when talking about products that have a fire safety performance requirement. However, it is often a reality: where the same product has multiple variations, such as diameter, thickness, weight or even colour, the permutations can be almost endless and impractical to test. For example, if a decorative vinyl-coated wallpaper comes in three different weights, with three different patterns, in three different colours, there would be 27 variations to test. Add three different shades of the three colours and the number increases to 54. Factor in the three EN 13823 fire test samples that must be burned to provide a valid test result for each permutation, plus the minimum of 18 small-scale EN 11925 flame application tests per permutation that are also required, and the number of individual tests would be in excess of 1,000. It would be prohibitive for manufacturers to test all their variations.
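The back-of-envelope arithmetic above can be sketched in a few lines of code. This is purely illustrative; the function name and parameters are not from any standard, and the test counts per variant (three EN 13823 specimens, eighteen EN 11925 flame applications) are taken from the wallpaper example.

```python
# Illustrative count of individual fire tests needed if every product
# variation were tested directly (figures from the wallpaper example).

def total_tests(weights, patterns, colour_options,
                sbi_tests_per_variant=3,            # EN 13823 (SBI) specimens
                small_flame_tests_per_variant=18):  # EN 11925 flame applications
    """Return (number of variants, total individual tests) for full coverage."""
    variants = weights * patterns * colour_options
    tests = variants * (sbi_tests_per_variant + small_flame_tests_per_variant)
    return variants, tests

print(total_tests(3, 3, 3))  # 27 variants, 567 tests
print(total_tests(3, 3, 6))  # add three shades: 54 variants, 1134 tests
```

The multiplication makes the article's point concrete: adding a single new dimension of variation multiplies, rather than adds to, the test burden.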

With this in mind, there has to be a way of range-assessing products to provide a test regime that covers the fire performance of all variations. Within harmonised European product standards (where they exist) there is guidance on range assessment, and there is also European guidance on providing extended application assessments for fire resistance and reaction to fire products and systems.

However, standalone engineering assessments do have to be used in some instances to allow manufacturers to limit their test commitments, whilst still offering a high level of assurance that the full scope of variations will perform the same way in a test and gain the same fire test classification.

Assessments are also used to justify variations that occur on site which may have some changes from the ‘as tested’ specification. For example, fire resisting glazing can be produced in larger sizes than can be tested on a fire test rig. So, in an end use application, an engineering assessment has to be made that the increase in size will not significantly affect the fire performance of the glazed screen.

 

Only when necessary

It is important to note that assessments should never be used to avoid testing. Engineering judgements should be based on test data and completed by experts, using sound engineering principles. If there is test evidence for the most onerous design, a judgement can be made to cover designs deemed less onerous.

Unfortunately, this has not always been the case, and the validity and reliability of assessments have come into question following building fires in which assessed products and systems have failed.

Indeed, assessments are something that the Government (via the Ministry of Housing, Communities & Local Government) has expressed concerns about following the Grenfell tragedy. The immediate suggestion from government was to ban all assessments and test everything, though it was quickly pointed out that this was simply unfeasible. There is not sufficient test laboratory capacity to do so; it would prevent thousands of products from being available for use and delay new products from entering the market, increasing costs for the entire supply chain.

Primary test evidence should be the ultimate in assurance – what was tested is exactly the same as what is going into the building. However, that isn’t going to happen for every specification of every product.

 

Defining a reliable assessment

One way in which the end user can have confidence in an assessment is where a product has been certificated through a third-party certification scheme. Accredited certification bodies complete independent conformity assessments on products to provide confidence in that product's or system's fire performance.

The end scope of certification will be formulated against recognised normative performance standards and determined by assessments underpinned by the actual tests completed and the specified extended application rules. The certification body will also have competent engineers with experience of the products being certificated, who will have to prove their competence to UKAS as part of their accreditation audits (as defined in ISO/IEC 17065, Conformity assessment – Requirements for bodies certifying products, processes and services).

Where third-party certification and extended application guidance cannot be followed, the process of assessment is potentially less structured and can offer less assurance. What elements, then, are required to offer good assurance?

The independence and competence of the organisation completing the assessment is paramount. The use of accredited certification bodies is normally deemed an assurance of the independence and competence of the assessment author. Indeed, the Building Regulations state:

‘Tests and assessments should be carried out by organisations with the necessary expertise. For example, organisations listed as “notified bodies” in accordance with the European Construction Products Regulation or laboratories accredited by UKAS for the relevant test standard can be assumed to have the necessary expertise.’ (Amendments published in December 2018, The Building Regulations 2010)

The assurance is that a certification body will:

  • Have professional indemnity insurance to cover assessment activities
  • Operate a quality assurance management system (e.g. ISO 9001)

Within such a management system, there should be defined procedures for completing assessments, as well as a matrix of competences for those completing the assessments, with justifications for signing off assessors'/engineers' ability to complete the work. This matrix will take account of experience, qualifications and CPD records to identify their current best practice knowledge.

In addition, assessors/engineers will follow existing best practice guidance, such as the Passive Fire Protection Forum (PFPF) Guide to undertaking technical assessments of the fire performance of construction products based on fire test evidence (2019) – Industry Standard Procedure.

Good assessment reports should include:

  • Supporting evidence, such as primary test evidence in full. If the test sponsor has granted permission, the assessment report should include the full test reports; if the sponsor wishes to withhold them, the test reference numbers must be included. Primary evidence, not secondary evidence, should be the basis for the assessment judgement.
  • Secondary evidence, such as test evidence for similar systems, indicative tests or standard performance data from codes or standards. All data and evidence used to make an assessment should be fully referenced within the report.
  • The report should be specific to the product as supplied and identified by the manufacturer, including all product/range/brand names.
  • The details of the applicant (the company requesting the assessment) should be included, along with the reason the assessment has been requested.
  • The assessor must highlight how they have formulated their opinion and provide a clear justification for their decision. That decision should be reviewed and signed off by another member of the assessment body.
  • The report should state the test standard against which the assessment has been carried out.
  • A validity statement, noting that the report may be superseded by primary test evidence if it becomes available. If the assessment is not for a specific project, the report should have a time limitation imposed, usually of around five years.
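The checklist above lends itself to a simple completeness screen. The sketch below is hypothetical: the field names are illustrative shorthand for the report elements listed, not terms from any standard or the PFPF guide.

```python
# Hypothetical sketch: the report checklist above expressed as data, so a
# draft assessment report can be screened for missing elements.

REQUIRED_FIELDS = [
    "primary_evidence_refs",    # full test reports or test reference numbers
    "secondary_evidence_refs",  # similar systems, indicative tests, codes
    "product_identification",   # product/range/brand names as supplied
    "applicant_details",        # who requested the assessment, and why
    "justification",            # how the assessor formed their opinion
    "reviewer_signoff",         # second member of the assessment body
    "test_standard",            # standard assessed against
    "validity_statement",       # supersession by primary evidence; time limit
]

def missing_elements(report: dict) -> list:
    """Return the checklist items that are absent or empty in a draft report."""
    return [field for field in REQUIRED_FIELDS if not report.get(field)]

draft = {"product_identification": "Acme FR Wallpaper",
         "test_standard": "EN 13501-1"}
print(missing_elements(draft))  # six elements still outstanding
```

A screen like this cannot judge the quality of the engineering justification, of course; it only confirms that each element the checklist demands is at least present.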

Assessment reports are an essential part of the supply chain of products with a fire performance requirement, and when done correctly should offer credible assurance of fire performance.

Comment from Aaron, June 19, 2020:

In terms of the person undertaking the assessment – the assessor – how do we decide on their competence to do this?

Secondly, do we require any independence from the system being assessed, or are we OK with the manufacturer of said system providing the assessment?