Apples to Apples

In our last two posts, we discussed the fact that OQ qualifications were always intended to be operator specific and that the criteria used to conduct OQ evaluations were always intended to be linked to the operator’s work processes and O&M procedures. We delved into several reasons why our industry may have evolved away from those principles in favor of off-the-shelf training/materials developed by third parties. Those reasons include:

  1. Silo organizations
  2. Variability vs Logistics
  3. The culture of third-party service providers
  4. Enforcement actions inconsistent with enforcement guidance

In this post, we’ll explain one of the outcomes of this “evolution” that has not served our industry particularly well. We’ll look at the issue of “OQ qualification equivalence” (or “portability”) and we’ll focus on this issue with a specific eye toward the impact this has had on the qualification of contractor personnel.

Equivalence

There are countless OQ matrices floating around out there designed to show equivalence of covered tasks and/or qualifications. You can find matrices that compare operator to operator, operator to third party, third party to third party, industry standard to industry standard, and so on. These matrices are used to determine whether to “accept” OQ qualifications that were put in place through other organizations and/or processes.

Operators utilize these matrices for various reasons.

  1. Operator-to-operator matrices are used in the case of mergers/acquisitions and/or onboarding personnel hired from another operator;
  2. These matrices are used in regulatory audits to justify the granting of qualifications that may have been put in place through a different program/approach; and
  3. These matrices are most commonly used in the interest of logistics (saving time and money).

The risk is that, in the absence of due diligence, we may be abandoning the principle of “operator specific” and increasing the likelihood of non-equivalent qualifications among individuals performing identical covered tasks.

In too many cases, these matrices are constructed by looking only at the title of the covered task, the regulatory citation underlying the covered task, or both. One must get into the weeds to truly determine whether one covered task/qualification is indeed “equivalent” to another. True equivalency cannot be measured by the title of the task. It must be measured through comparison of the knowledge, skills, abilities, and abnormal operating conditions (AOCs) defined within the evaluation criteria for the covered tasks in question.
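To make the idea concrete, the comparison described above can be sketched in code. The snippet below is a hypothetical illustration only: the task names and criteria strings are invented for the example, real evaluation criteria rarely match word for word across programs, and an actual equivalence determination requires subject-matter-expert judgment, not string matching. The point it demonstrates is structural: equivalence is assessed section by section (knowledge, skills, AOCs), never by title.

```python
# Hypothetical sketch: compare two covered tasks by their evaluation
# criteria rather than by their titles. All names and criteria below are
# illustrative assumptions, not drawn from any real OQ program.

def equivalence_report(task_a: dict, task_b: dict) -> dict:
    """Compare two covered tasks section by section and report overlap."""
    report = {}
    for section in ("knowledge", "skills", "aocs"):
        a, b = set(task_a.get(section, [])), set(task_b.get(section, []))
        shared, union = a & b, a | b
        report[section] = {
            "shared": sorted(shared),
            "only_in_a": sorted(a - b),   # criteria the second program omits
            "only_in_b": sorted(b - a),   # criteria the first program omits
            "overlap": len(shared) / len(union) if union else 1.0,
        }
    return report

# Two tasks that would share a title on a matrix, with different criteria.
task_a = {
    "knowledge": ["measure structure-to-electrolyte potential",
                  "perform close interval survey"],
    "skills": ["measure structure-to-electrolyte potential"],
    "aocs": ["coating damage", "corrosion damage"],
}
task_b = {
    "knowledge": ["measure structure-to-electrolyte potential",
                  "identify types of CP surveys"],
    "skills": ["perform test point survey"],
    "aocs": ["coating damage"],
}

for section, result in equivalence_report(task_a, task_b).items():
    print(section, f"{result['overlap']:.0%}", result["only_in_b"])
```

Even this toy comparison shows the gaps a title-only matrix hides: the two tasks above share a name yet disagree on skills entirely.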

There are many examples of two things which share a name or title, but which are significantly different from one another. For example:

  1. Apple
    a. A type of fruit
    b. A multinational technology company
  2. Phoenix
    a. A mythical bird that rises from its ashes in folklore
    b. A city in Arizona, USA
  3. Club
    a. A group or association
    b. A type of weapon
    c. A suit in a deck of playing cards

Same title but significant differences in definition. While those may be silly examples, they illustrate very well that name/title alone is not an adequate definition. We deal with this in OQ every day.

OQ-Related Issues

  1. When an operator modifies a covered task, or the evaluation criteria underlying that covered task, the following question is unavoidable:

“Must I re-evaluate the people qualified to the old version, or is it ‘equivalent’ to an extent that allows me to continue to deem the person qualified until expiration of the current qualification?”

Properly answering that question demands that we get “in the weeds” – even if the title of the covered task is unchanged! Operators usually deal with this question in the context of their Management of Change (MOC) process by judging the “significance” of the change.

  2. When an operator allows its contractors to qualify their personnel through a process that is different from the one used for its internal personnel, the question of “equivalence” is unavoidable. In those cases where an operator gives its contractors a choice of processes (i.e., multiple third-party service providers), the complexity of the “equivalence” question becomes significantly greater. Commonly, operators go no further than the title of the covered task to make these determinations of “equivalence”.

This is broken and in need of repair.

Making the Case

The tables below illustrate why determining equivalence based on the title of a covered task is a problem. The examples below provide a comparison for three “widely applicable” covered tasks which apply to many operators and/or their contractors. There is an obvious contrast between the examples in all three cases. Clear differences exist when comparing regulatory citations, knowledge criteria, skills criteria, and AOCs.

Cathodic Protection Survey

Regulatory References

| Example #1 (49 CFR 192) | Example #1 (49 CFR 195) | Example #2 (49 CFR 192) | Example #2 (49 CFR 195) |
|---|---|---|---|
| 192.457(a) | 195.561 | 192.328(e) | 195.573(a) |
| 192.465(a, c) | 195.563 | 192.465(a) | |
| 192.467(d) | 195.573 | 192.465(e) | |
| 192.481 | 195.575 | 192.620(d)(6) | |
| 192.491 | 195.577 | | |
| | 195.583 | | |
| | 195.589 | | |

Knowledge Criteria

| Example #1 | Example #2 |
|---|---|
| Define and understand the function of conducting annual cathodic protection surveys | Explain what is required prior to performing task |
| Describe how to measure structure-to-electrolyte potential | Identify types of CP surveys |
| Describe how to perform close interval survey | Identify and describe the test equipment used to complete a cathodic protection survey |
| Describe how to perform interference testing | Identify and describe measurements that may be required at a given test point |
| Describe how to test for electrical isolation | Describe use of interrupters |
| Describe how to inspect and electrically test bonds | Identify common methods of determining IR drop |
| Describe how to conduct visual atmospheric inspection | |

Skills Criteria

| Example #1 | Example #2 |
|---|---|
| Measure structure-to-electrolyte potential | Demonstrate how to perform test equipment check to verify that equipment functions within specified parameters |
| Perform close interval survey | Demonstrate how to perform each of (1) test point survey and (2) AC potential survey |
| Conduct AC current attenuation survey | |
| Conduct AC voltage gradient survey | |
| Conduct DC voltage gradient survey | |
| Perform interference testing | |
| Test for electrical isolation | |
| Inspect and electrically test bonds | |
| Conduct visual atmospheric inspection | |

Abnormal Operating Conditions

| Example #1 | Example #2 |
|---|---|
| Component malfunction or failure | Unintentional release, vapors, or hazardous atmosphere |
| Coating damage | Material defects, anomalies, or physical damage of pipe |
| Physical damage | Failure or malfunction of pipeline component(s) |
| Corrosion damage | |
| Cathodic protection reading outside of expected ranges | |
| Unexpected hazardous product | |
| Fire or explosion | |
Provide Security for Pipeline Facilities

Regulatory References

| Example #1 (49 CFR 192) | Example #1 (49 CFR 195) | Example #2 (49 CFR 192) | Example #2 (49 CFR 195) |
|---|---|---|---|
| NA | 195.436 | 192.317 | 195.434 |
| | | 192.705 | 195.436 |

Knowledge Criteria

| Example #1 | Example #2 |
|---|---|
| Explain what is required prior to performing task | Describe how to provide security for pipeline facilities including (a) protection from vandalism and unauthorized entry, (b) methods used to secure facilities, (c) examples of remote security devices, (d) important elements for providing a secure facility, (e) intrusion devices, (f) remote monitoring, (g) routine inspections, and (h) documentation |
| Describe steps to follow in providing protection for (a) pipeline, (b) pump stations, (c) breakout tank areas, and (d) other exposed facility equipment | |

Skills Criteria

| Example #1 | Example #2 |
|---|---|
| NA | Provide security for pipeline facilities |

Abnormal Operating Conditions

| Example #1 | Example #2 |
|---|---|
| Unintentional release, vapors, or hazardous atmosphere | Vandalism or other breach of security |
| Unintended fire and/or explosion on or near the pipeline | Unexpected hazardous product |
| Evidence of sabotage or criminal activity | |
Leakage Survey

Regulatory References

| Example #1 (49 CFR 192) | Example #1 (49 CFR 195) | Example #2 (49 CFR 192) | Example #2 (49 CFR 195) |
|---|---|---|---|
| 192.706 | NA | 192.706 | NA |

Knowledge Criteria

| Example #1 | Example #2 |
|---|---|
| Define and demonstrate a working knowledge of leakage surveys and the equipment used to perform them | Explain what is required prior to performing task |
| Describe how to perform visual surveys for pipeline leakage | Describe means of identifying leaks |
| Describe how to perform pipeline leakage survey using a combustible gas detector | Describe how to test casing vents with gas detector |
| Describe how to perform pipeline leakage survey using a flame ionization detector | |

Skills Criteria

| Example #1 | Example #2 |
|---|---|
| Perform visual surveys for pipeline leakage | Demonstrate use of the leak detection device according to manufacturer’s guidelines |
| Perform pipeline leakage survey using a combustible gas detector | |
| Perform pipeline leakage survey using a flame ionization detector | |
| Perform pipeline leakage survey using an optical methane detector | |
| Perform pipeline leakage survey using a remote methane leak detector | |

Abnormal Operating Conditions

| Example #1 | Example #2 |
|---|---|
| Exposed pipe segments, damaged or missing supports for above ground piping, missing or damaged signs, or unsupervised third-party construction | Unintentional release, vapors, or hazardous atmosphere |
| Ignition of released hydrocarbon | Unintended fire and/or explosion on or near the pipeline |
| Leakage around stem of mainline block valve | Physical damage of pipe or a component that has impaired or is likely to impair the serviceability of the pipeline |
| | Failure or malfunction of pipeline component(s) |
| | Unreported encroachment activities |

The purpose of showing these examples is not to pass judgment on the quality or accuracy of either. The purpose is simply to show the inconsistencies between the two examples for each covered task and to reinforce the point that there is a real danger in reviewing only the title of the task to determine equivalence.

We should assume we are not dealing with apples to apples: whatever the title of a covered task may be, the title alone is not indicative of equivalence to another covered task. Even if the Pareto principle applies here – that is, even if the 80/20 rule is in play and the two tasks in question are 80% equivalent – I would submit that 80% is not adequate if the goal is to minimize risk and achieve real compliance with OQ.

Another issue that needs to be addressed (we’ll tackle this one soon) is the equivalency of evaluations. Even if two operators had identical OQ programs and each qualification was based on exactly the same evaluation criteria, there remains a question of equivalency based on (1) the evaluator who conducted the evaluation and (2) the methods and process behind the evaluation. For example:

  1. The evaluator who is considered a subject matter expert (SME) by one operator may not be nearly as knowledgeable or experienced in the eyes of another operator;
  2. The evaluation methods used by one evaluator may not be the same as those used by the other; and
  3. The training each evaluator received may differ.

The most important comparison remains the one with the operator’s own O&M procedures. In the end, that is the biggest question: “Are the qualifications I accept consistent with my actual work processes and O&M procedures?” There may be other factors that affect the question of operator specificity but fall outside the O&M procedures. How one performs a given task may be related to:

  1. Age/vintage of the system
  2. Environments in which the pipeline is operated
  3. Types of pipe coating used
  4. Types of materials and equipment used
  5. Operating history

These characteristics are also directly related to the principle of “operator specific”.

What We’re Doing Right

It has been pointed out to me by someone who I hold in high regard that, despite several things being “broken” with our industry’s approach to OQ, there is much that is being done well and much that we continue to get better at. That is exactly right!

Here are some of the areas in which we are performing well:

  1. Our industry continues to talk about and identify problems that will help us mitigate shortfalls in our collective OQ efforts.
  2. Work is continually being done to improve and enhance industry standards such as API RP 1161 and ASME B31Q.
  3. Advancements in technology are making us better every day at managing assignments, qualifications, and change, interacting with contractors, and communicating (both internally and externally).
  4. API RP 1173 (PSMS) is gaining more and more momentum in our industry – it will help us integrate our OQ efforts into a broader, more holistic management system. This promises to help our industry better manage risk.

Stay Tuned

Systemic Compliance is hard at work developing a new approach. We will soon offer help to operators in linking training content and task evaluation criteria to THEIR internal procedures and will provide a smarter approach to determining equivalence. We will honor the variability that was designed into the OQ Rule. And we will deliver this new approach in a way that complements a management system framework (PSMS) and is both time- and cost-effective (logistics).

In the next post, we’ll move to a different “broken” issue. Please stay tuned.

Contact us today to discuss how Systemic Compliance can help. We’re confident we can be a valuable and cost-effective partner in designing, implementing, and evaluating the effectiveness of your OQ Program.

COMPLIANCE SHOULD BE SYSTEMIC!

