DOE has identified a critical need for real-time collection and transmission of large monitoring datasets to enable real-time risk identification for carbon storage projects. We are developing an adaptive, low-cost, automated data collection, compression, and transmission system to address this need. Monitoring, Verification, and Accounting (MVA) is a significant cost component of any carbon storage campaign. Typically, a complex network of sensing technologies and data acquisition systems is deployed to continuously monitor CO2 migration during and after injection. These deployments generate large volumes of complex data that must be transmitted quickly and interpreted affordably to support both real-time decision-making and post-analysis. Conventional data transmission mechanisms are often rudimentary, requiring substantial time to move high-resolution data to the cloud, and the subsequent processing demands significant human intervention. A low-cost MVA solution must therefore address today's data-management bottlenecks, including, but not limited to, insufficient bandwidth, inadequate in-situ storage, and limited connectivity.

Our solution is to architect a system that can: (i) significantly reduce the volume of data captured on site, (ii) provide a reliable channel to combine and transmit the compressed information to a database for further analysis, and (iii) provide streamlined protocols to standardize and expand real-time monitoring procedures. In the Phase I study, we developed novel compression algorithms for automated collection and transmission of time-lapse seismic data and demonstrated their feasibility. Our aims for Phase II are to optimize compression gain and reconstruction quality and to advance the development cost-effectively toward commercialization of the final product.

First, we plan to collect additional, independent spatio-temporal datasets with which to fine-tune the developed algorithms, reducing their sensitivity to the input data and maximizing reconstruction quality through algorithm assembly, or composite compression; this makes it possible to measure, compress, and transmit different data modalities appropriately. Second, we plan to optimize an adaptive edge-computing platform comprising a cluster of low-cost, solar-powered compute modules tailored to the complexity of the proposed composite compression algorithms. This platform will be affordable enough to perform intensive data compression at the edge and scalable enough to adapt to the hardware available on site. Finally, we plan to develop anomaly-detection and notification engines bundled into a cloud-hosted software platform. The main objective here is to standardize the real-time monitoring protocol and give users the flexibility to analyze their data with sophisticated analytics tools.
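To make the trade-off between compression gain and reconstruction quality concrete, the sketch below compresses a synthetic seismic-like trace by retaining only the largest-magnitude Fourier coefficients and then reports the resulting compression ratio and reconstruction SNR. It is a minimal illustration only, assuming NumPy and a simple transform-thresholding scheme; it is not the algorithm developed in Phase I, and all function names, the synthetic trace, and the parameters are hypothetical.

```python
import numpy as np

def compress_fft(trace: np.ndarray, keep_fraction: float = 0.05):
    """Keep only the largest-magnitude Fourier coefficients of a 1-D trace."""
    coeffs = np.fft.rfft(trace)
    n_keep = max(1, int(keep_fraction * coeffs.size))
    idx = np.argsort(np.abs(coeffs))[-n_keep:]          # indices of retained coefficients
    return idx.astype(np.uint32), coeffs[idx].astype(np.complex64), trace.size

def reconstruct_fft(idx, values, n_samples):
    """Rebuild the trace from the retained coefficients (all others set to zero)."""
    coeffs = np.zeros(n_samples // 2 + 1, dtype=np.complex64)
    coeffs[idx] = values
    return np.fft.irfft(coeffs, n=n_samples)

def snr_db(original, reconstructed):
    """Reconstruction signal-to-noise ratio in decibels."""
    noise = original - reconstructed
    return 10.0 * np.log10(np.sum(original**2) / np.sum(noise**2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 4000)
    # Synthetic "seismic-like" trace: two band-limited arrivals plus noise (illustrative only).
    trace = (np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.3) / 0.05) ** 2)
             + 0.5 * np.sin(2 * np.pi * 60 * t) * np.exp(-((t - 0.7) / 0.03) ** 2)
             + 0.02 * rng.standard_normal(t.size))

    idx, values, n = compress_fft(trace, keep_fraction=0.05)
    recon = reconstruct_fft(idx, values, n)

    raw_bytes = trace.astype(np.float32).nbytes
    packed_bytes = idx.nbytes + values.nbytes
    print(f"compression ratio ~ {raw_bytes / packed_bytes:.1f}x, "
          f"reconstruction SNR ~ {snr_db(trace, recon):.1f} dB")
```

In a composite-compression setting of the kind described above, the retained-coefficient fraction (or whichever knob the chosen algorithm exposes) would presumably be tuned per data modality to balance on-site bandwidth against reconstruction fidelity.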
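Similarly, an anomaly-detection engine can, in its simplest form, flag samples that deviate strongly from recent history. The rolling z-score sketch below is a hedged illustration under that assumption, not the planned cloud engine; the pressure stream, window length, and threshold are hypothetical.

```python
import numpy as np

def rolling_zscore_anomalies(signal: np.ndarray, window: int = 200, threshold: float = 4.0):
    """Flag samples whose deviation from a trailing-window mean exceeds
    `threshold` standard deviations. Illustrative thresholding logic only."""
    flags = np.zeros(signal.size, dtype=bool)
    for i in range(window, signal.size):
        past = signal[i - window:i]
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pressure = 10.0 + 0.05 * rng.standard_normal(5000)   # hypothetical monitoring stream
    pressure[3200:3210] += 1.5                            # injected step change to detect
    flags = rolling_zscore_anomalies(pressure)
    print(f"{flags.sum()} anomalous samples flagged, first at index {np.argmax(flags)}")
```

In a deployed system, flags of this kind would feed the notification layer, with the detection logic itself running either at the edge or in the cloud-hosted platform depending on bandwidth and latency constraints.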