New research articulates the profound differences between carbon capture technologies, outlining the economic and environmental implications of each system. Scientists and engineers have developed these technologies to help mitigate climate change. One that you may have read about before is "Carbon Capture and Storage" (CCS), an innovation that captures emitted carbon and stores it underground. Today, scientists are adapting CCS into a potentially more profitable system called "Carbon Capture and Utilization" (CCU).
So what exactly is CCS?
Carbon Capture and Storage catches CO2 waste from large point sources such as cement factories or fossil fuel power stations, then deposits the carbon in underground geological formations, preventing the CO2 from entering the atmosphere. The technologies used in the CCS process include absorption, adsorption, chemical looping, membrane gas separation and gas hydrate formation. By storing carbon emissions, CCS tackles both global warming and ocean acidification. The main issue with this system is that it is unprofitable and its machinery is highly expensive. Because of this economic barrier, CCS is barely in use: there are currently only 17 operating CCS facilities around the world.
In response, scientists and engineers created the CCU system (Carbon Capture and Utilization). The difference with this system is that instead of being stored underground, the captured carbon is upcycled into chemicals, diamonds and fuels. Through this profitable approach, carbon can become a renewable resource that meets demand in a less polluting manner. It is not as sustainable as it sounds, however: its carbon-capturing technology can consume roughly as much energy as the burning of fossil fuels produces in the first place.
Fun fact: Did you know that earlier this year Microsoft pledged to become carbon negative, i.e., to eventually remove all the carbon it has ever emitted?