2021 Vol. 4, No. 5

Cover Story: Liu T, Li H, He T, Fan CZ, Yan ZJ et al. Ultra-high resolution strain sensor network assisted with an LS-SVM based hysteresis model. Opto-Electron Adv 4, 200037 (2021)

Understanding the physics and dynamics of the solid Earth is essential for meeting the challenges of natural hazards, the energy crisis, and climate change. Geoscience data acquisition requires sensing the Earth's crustal deformation with high strain resolution over a frequency band from static to 100 Hz, together with dense spatial coverage of sensors. An optical fiber distributed acoustic sensor (DAS) can offer a long sensing distance and a dynamic strain resolution down to the nano-strain level by burying the fiber cable underground as the sensing element. However, noise induced by cross-talk from probe-laser frequency drift and environmental temperature variation severely degrades the signal-to-noise ratio (SNR) of the DAS in the ultra-low-frequency band, which is essential for monitoring natural earthquakes and Earth tides. Recently, Professor Qizhen Sun of the School of Optical and Electronic Information, Huazhong University of Science and Technology, reported a high-resolution strain sensor network with large sensing capacity for geoscience research, introducing an in-situ reference fiber and a real-time AI technique to compensate for the low-frequency noise of the DAS. The compensation method employs a least-squares support vector machine (LS-SVM) with hysteresis operators, which reduces the thermal hysteresis between the reference fiber and the sensing fiber. The prototype strain sensor network with 55 sensing elements demonstrates an ultra-low-frequency strain resolution of 166 pε at 0.001 Hz. The large sensing capacity, together with the ultra-high quasi-static resolution, gives the proposed strain sensor network great potential for applications in geoscience research.
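Conceptually, an LS-SVM with hysteresis operators amounts to a kernel regression from the reference-fiber reading, augmented with memory-carrying hysteresis features, onto the slow drift seen by the sensing channel, whose prediction is then subtracted. The minimal Python sketch below illustrates that idea using classical play operators and the standard LS-SVM linear system on synthetic data; the operator choice, feature set, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: compensate a shared low-frequency drift on a
# sensing channel using the reference channel plus hysteresis features and
# an LS-SVM regressor. Names and hyperparameters are hypothetical.

def play_operator(x, threshold):
    """Classical play (backlash) hysteresis operator, applied sample by sample."""
    y = np.empty_like(x)
    state = 0.0
    for i, xi in enumerate(x):
        state = min(max(state, xi - threshold), xi + threshold)
        y[i] = state
    return y

def hysteresis_features(ref, thresholds):
    """Stack the raw reference signal with several play operators of it."""
    cols = [ref] + [play_operator(ref, r) for r in thresholds]
    return np.column_stack(cols)

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM regression: solve the standard KKT linear system for (b, alpha)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0], X, sigma          # alpha, bias, training data, kernel width

def lssvm_predict(model, Xnew):
    alpha, b, Xtr, sigma = model
    return rbf_kernel(Xnew, Xtr, sigma) @ alpha + b

# Synthetic example: a slow drift common to both fibers plus a small
# quasi-static strain signal present only on the sensing fiber.
t = np.linspace(0, 1000, 500)
drift = 0.5 * np.sin(2 * np.pi * t / 600)            # shared low-frequency noise
ref = drift + 0.01 * np.random.randn(t.size)         # reference-fiber reading
sense = drift + 1e-3 * np.sin(2 * np.pi * t / 100)   # sensing-fiber reading

X = hysteresis_features(ref, thresholds=[0.05, 0.1, 0.2])
model = lssvm_fit(X, sense, gamma=50.0, sigma=2.0)
compensated = sense - lssvm_predict(model, X)        # drift-suppressed output
```

In this toy setup the regularization term (1/gamma on the kernel diagonal) keeps the regressor from absorbing the small strain signal, so subtracting the prediction removes mainly the drift that is predictable from the reference channel.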
