Fiber Optic Arc Flash Detection Modeling

I will start by stating that I am not 100% familiar with these types of systems [fiber optic light detection]. In a recent MV switchgear submittal, the manufacturer [I assume at the recommendation of the engineer of record and/or the owner] included a fiber optic light detection system for arc flash mitigation. This included "point source detection" at the terminations, in addition to "linear fiber optic detection" in the breaker cubicles at the runbacks / breaker connections.

From discussions with multiple manufacturers of these types of systems and with colleagues, my understanding is that the fiber optic system detects the light emitted during the early stages of an event and triggers the relay trip circuit to operate the breaker. Most of the manufacturers' literature indicates a lower category of PPE is required due to the earlier detection and interruption. However, I have not found any literature or testing that quantifies what level of light [lumens?] is required to initiate the trip signal, or what level of arcing current that corresponds to.

So all of this leads to my questions: How is this type of detection system typically modeled in the various arc flash software packages if there is no testing correlating arcing current to emitted light? Has IEEE reviewed this? How can manufacturers claim or "guarantee" a PPE level two categories lower? Is that based purely on testing by the manufacturer? I understand that these types of systems are recognized by the IEC [62271-200].

Has anyone on the forum installed, started up, or otherwise worked with these types of systems? I would like your feedback on their reliability, and on whether there is an industry standard [ANSI/IEEE/UL] covering how these systems are manufactured, installed, etc. Thanks in advance.
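For what it's worth, here is my rough understanding of where the "lower PPE" claim comes from, independent of any light-level testing: under IEEE 1584, incident energy scales approximately linearly with arcing duration, so if light sensing cuts relay detection from a conventional overcurrent delay down to a few milliseconds, the incident energy drops roughly in proportion to the total clearing time. A back-of-envelope sketch (all numbers hypothetical, and the PPE-category mapping is the informal NFPA 70E arc-rating tiers, not a substitute for an actual study):

```python
# Back-of-envelope only: IEEE 1584 incident energy scales ~linearly with
# arcing duration (E is proportional to t). All specific numbers below are
# illustrative assumptions, not vendor or test data.

def incident_energy(e_ref_cal_cm2: float, t_ref_s: float, t_clear_s: float) -> float:
    """Scale a reference incident energy to a new total clearing time (E ~ t)."""
    return e_ref_cal_cm2 * (t_clear_s / t_ref_s)

def ppe_category(e_cal_cm2: float) -> int:
    """Map incident energy to the minimum NFPA 70E arc-rating tier (informal)."""
    for cat, rating_cal_cm2 in ((1, 4.0), (2, 8.0), (3, 25.0), (4, 40.0)):
        if e_cal_cm2 <= rating_cal_cm2:
            return cat
    raise ValueError("exceeds 40 cal/cm^2; de-energize or redesign")

# Hypothetical bus: 20 cal/cm^2 at a 0.5 s relay + breaker clearing time.
# Assume light detection trips the relay in ~2 ms plus a ~50 ms breaker,
# for ~0.052 s total clearing time.
e_slow = 20.0
e_fast = incident_energy(e_slow, t_ref_s=0.5, t_clear_s=0.052)
print(f"{e_fast:.1f} cal/cm^2 -> CAT {ppe_category(e_fast)} "
      f"(was CAT {ppe_category(e_slow)})")
```

With those assumed numbers the energy drops about tenfold, which is the kind of arithmetic that would support a "two categories lower" claim. But that still sidesteps my actual question: it assumes the light sensor reliably fires at the onset of arcing, which is exactly the detection-threshold behavior I can't find test data for.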