Clean Energy

A tech-powered approach to overcoming grid bottlenecks

Transmission lines outside Houston, Texas (Courtesy: BFS Man/Flickr)

Contributed by Grzegorz Marecki, co-founder and CEO of Continuum Industries

The expansion of electricity transmission infrastructure is crucial for meeting growing energy demands and accelerating the United States’ clean energy transition. However, spatial planning processes are struggling to keep pace with the speed of change required to tackle climate change. Building clean energy infrastructure today can take more than a decade, largely due to delays in planning and permitting. 

The root of this challenge lies in the complexity of infrastructure planning. Developers must simultaneously satisfy dozens of stakeholders, balancing technical and regulatory considerations with a wide range of perspectives, priorities, and concerns. Utilities, which have traditionally functioned as asset managers, are now expected to act as developers and drive the rapid expansion of America's grid. However, their processes have not evolved at the pace this urgent need requires.

Additionally, as the volume of work increases, the industry lacks the resources to deliver at the speed required, and the pool of specialists available for traditional planning tasks is limited. Automating repetitive tasks can free up these professionals to focus on more complex challenges.

To overcome these challenges, the industry must embrace a technology-powered paradigm shift. A tech-enabled approach to planning processes, supported by professional oversight, has the potential to revolutionize the development of new energy networks.

Frontloading data for more predictable permitting

With the advancement of technology, governments and other key stakeholders now have access to unprecedented amounts of data. If harnessed effectively, this data can help infrastructure developers expedite decision-making processes and streamline the planning and permitting phases. 

New tools allow for a comprehensive data dive right at the start of a project, providing a full picture of constraints and opportunities. AI algorithms offer intelligent insights that guide developers toward optimal decisions, and users can configure assumptions, preferences, and project goals before the algorithm runs, ensuring a clear link between inputs and outputs. For example, easier access to spatial data gives developers a comprehensive view of the permits a project would require, where traditionally they would have to wait a few months for a manually produced report from a consultant. It also allows professionals to focus on the areas of highest risk.
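To make the permit-screening idea concrete, here is a minimal sketch of how a frontloaded spatial check might work. It uses the open-source shapely library; the corridor geometry, constraint layers, permit names, and the check_permits helper are all illustrative assumptions rather than a description of any real tool, dataset, or regulatory logic.

# Hypothetical permit screening sketch: intersect a proposed corridor with
# constraint layers to flag which permits are likely to be triggered.
# Layer geometries and permit rules are illustrative assumptions only.
from shapely.geometry import LineString, Polygon

# Proposed transmission corridor (projected coordinates, e.g. meters), buffered to a 60 m right-of-way.
corridor = LineString([(0, 0), (5_000, 2_000), (12_000, 2_500)]).buffer(60)

# Constraint layers a developer might pull from public spatial datasets.
constraint_layers = {
    "wetland": Polygon([(4_000, 1_500), (6_000, 1_500), (6_000, 3_000), (4_000, 3_000)]),
    "floodplain": Polygon([(9_000, 0), (13_000, 0), (13_000, 1_000), (9_000, 1_000)]),
}

# Illustrative mapping from constraint type to the permit it would trigger.
permit_rules = {"wetland": "Section 404 review", "floodplain": "floodplain development permit"}

def check_permits(footprint, layers, rules):
    """Return the permits likely triggered by a project footprint."""
    return [rules[name] for name, geom in layers.items() if footprint.intersects(geom)]

print(check_permits(corridor, constraint_layers, permit_rules))
# e.g. ['Section 404 review']

Run against real datasets, a screening of this kind gives a first view of likely permits in minutes rather than months, leaving consultants to concentrate on the constraints it flags.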

Automated routing for unbiased solutions 

By automating routine processes, developers can assess far more alternatives than manual methods ever allowed and reduce decision bias. Algorithms can explore and optimize different solutions for infrastructure assets, simultaneously weighing factors such as cost, technical feasibility, and environmental and community impact.
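As a rough illustration of what automated routing can look like under the hood, the sketch below builds a composite cost surface from weighted constraint layers and finds a least-cost path across it with Dijkstra's algorithm. The layers, weights, and grid are invented for the example and do not represent any particular vendor's method.

# Minimal weighted least-cost routing sketch (illustrative only).
# Each raster layer scores one concern (cost, feasibility, impact); a weighted
# sum gives a composite cost surface, and Dijkstra finds the cheapest corridor.
import heapq
import numpy as np

rng = np.random.default_rng(0)
shape = (50, 50)

# Hypothetical constraint layers, each normalized to 0..1.
layers = {
    "construction_cost": rng.random(shape),
    "environmental_impact": rng.random(shape),
    "community_impact": rng.random(shape),
}
# Planner-chosen weights expressing project priorities (assumed values).
weights = {"construction_cost": 0.4, "environmental_impact": 0.35, "community_impact": 0.25}

composite = sum(weights[k] * layers[k] for k in layers) + 0.01  # avoid zero-cost cells

def least_cost_path(cost, start, goal):
    """Dijkstra over a 4-connected grid; returns the cell path from start to goal."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < cost.shape[0] and 0 <= nc < cost.shape[1]:
                nd = d + cost[nr, nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

route = least_cost_path(composite, start=(0, 0), goal=(49, 49))
print(f"{len(route)} cells, total cost {sum(composite[r, c] for r, c in route[1:]):.2f}")

Production routing tools work on much richer data and more sophisticated optimization, but the principle of turning stakeholder priorities into explicit, adjustable weights is the same.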

While still in the early stages of adoption in the US, automated routing is already demonstrating its potential, and it may pay to look across the pond for an example to follow. The UK is slightly ahead of the US on grid expansion, and its government is now not just encouraging but expecting transmission companies to standardize and automate routing. It has adopted a package of 19 measures intended to slash the project development timeline from 14 to 7 years, and utilities that have adopted a heavily automated approach report that they have been able to kick-start their projects and complete 12 months' worth of work in as little as 8 weeks.

As the energy sector transforms and projects become increasingly complex, automated tools will empower developers to respond swiftly to changing requirements, market dynamics, and regulatory landscapes.

Transparency and accountability through comprehensive decision-making

In the past, manual record-keeping and documentation processes left room for ambiguity and potential oversights. However, technology can establish a reliable audit trail, ensuring that every decision is logged, timestamped, and linked to the specific dataset or analysis that influenced it. This record enables project teams to revisit and refine decisions, supporting external regulatory approvals and fostering accountability throughout the process.
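One simple way to picture such an audit trail is a record that ties each decision to a timestamp and a fingerprint of the dataset it relied on. The sketch below is generic; the field names, the record_decision helper, and the example decision are assumptions for illustration only.

# Generic audit-trail sketch: each routing decision is stored with a timestamp
# and a hash of the dataset version that informed it, so it can be traced later.
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class DecisionRecord:
    decision: str          # what was decided
    made_by: str           # who signed it off
    timestamp: str         # when, in UTC
    dataset_sha256: str    # fingerprint of the data the decision relied on

def record_decision(decision: str, made_by: str, dataset_bytes: bytes) -> DecisionRecord:
    return DecisionRecord(
        decision=decision,
        made_by=made_by,
        timestamp=datetime.now(timezone.utc).isoformat(),
        dataset_sha256=hashlib.sha256(dataset_bytes).hexdigest(),
    )

flood_layer = json.dumps({"layer": "floodplain_2024", "version": 3}).encode()
entry = record_decision("Shift tower T-41 east of the floodplain", "routing team", flood_layer)
print(asdict(entry))

Because each record is immutable and carries a dataset hash, anyone reviewing the project later can verify exactly which version of the data a decision was based on.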

Breaking silos with cross-department collaboration

Technology can also bridge the divide between internal teams, fostering smooth collaboration and streamlining decision-making processes. Providing a shared platform for data and insights ensures that everyone involved in the project is working from the same information, reducing miscommunication and delays.

With technology, traditionally siloed disciplines can replace slow email chains with rapid feedback based on standard criteria. For example, when engineers move a tower closer to a water body to avoid difficult ground conditions, they receive automatic feedback on whether the new position still meets the setback requirements set by environmental protection policies.
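A minimal sketch of that kind of automatic setback check could look like the following. The 50-meter setback, coordinates, and geometries are invented for the example; a real check would read policy-defined distances and authoritative hydrography data.

# Illustrative setback check: does a proposed tower position keep the required
# distance from a water body? Uses shapely; assumes projected coordinates in
# meters. The setback distance and geometries are made-up example values.
from shapely.geometry import Point, Polygon

REQUIRED_SETBACK_M = 50.0  # assumed policy value, not a real regulation

water_body = Polygon([(100, 100), (300, 100), (300, 250), (100, 250)])
proposed_tower = Point(340, 180)  # candidate position after the engineer's move

clearance = proposed_tower.distance(water_body)
if clearance >= REQUIRED_SETBACK_M:
    print(f"OK: tower is {clearance:.1f} m from the water body")
else:
    print(f"Setback violated: only {clearance:.1f} m (need {REQUIRED_SETBACK_M} m)")

Because the rule is encoded once and evaluated instantly, the environmental team no longer has to re-check every tower move by email.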

Public engagement and transparent project narratives

Beyond internal teams, technology plays a crucial role in engaging the public and garnering support for critical infrastructure projects. Presenting decisions backed by solid evidence and interactive visualizations makes complex data digestible and addresses concerns head-on. This transparency builds trust and fosters a sense of shared ownership, crucial for navigating the permitting process and ensuring community buy-in. 

Stakeholder engagement is also changing thanks to technology. Dynamic maps and immersive 3D visualizations now allow project teams to collaboratively iterate with stakeholders, demonstrating the project’s evolution over time and minimizing impacts. This interactive approach, coupled with routing and siting automation, eliminates the traditional time constraints associated with manual rerouting.

The submission of documents, particularly environmental baseline schedules and reports, has also evolved. Algorithms can now identify potential impacts, presenting them to professionals for screening and defining mitigation strategies. This not only speeds up the process but also frees up professionals for more strategic tasks.

The Linear Infrastructure Planning Panel provides a noteworthy example of the efforts made towards more transparent and dynamic project planning. The Panel’s purpose is to engage key public interest stakeholders, including social and environmental groups, in the development of good practices and ethical approaches in the use of new techniques, such as algorithms and advanced software tools, for infrastructure planning. By actively involving various stakeholders, the Panel contributes to shaping responsible and inclusive technology integration in the planning process, setting a precedent for the industry.

Looking ahead

The American Council on Renewable Energy emphasizes that a $1.5 trillion investment in new transmission infrastructure by 2030 is not just a financial commitment, but an investment in a clean energy future. In this landscape, technology is not a silver bullet, but it is a powerful catalyst for change: it facilitates efficiency, transparency, and collaboration, enabling informed decision-making and accelerating the development of a robust and resilient grid.

By embracing a tech-powered approach, the US can overcome the bottlenecks plaguing the current system and realize the full potential of clean energy. This transformation isn’t about replacing human expertise; it’s about empowering and augmenting it, fostering a synergy that paves the way for a more efficient and sustainable future.

Major breakthrough in pursuit of nuclear fusion unveiled by US scientists

By: Tereza Pultarova

A nuclear fusion experiment produced more energy than it consumed.

Scientists at the Lawrence Livermore National Laboratory in California briefly ignited nuclear fusion using powerful lasers. (Image credit: Lawrence Livermore National Laboratory)

American researchers have achieved a major breakthrough that paves the way toward fusion-based energy generation, but significant hurdles remain.

Nuclear fusion is an energy-generating reaction that fuses simple atomic nuclei into more complex ones, for example combining hydrogen atoms into helium. It takes place in the cores of stars, which ignite when vast clouds of gas and dust collapse under gravity, creating immense pressure and heat at the center of the nascent star.

For decades, scientists have been chasing nuclear fusion as the holy grail of sustainable energy generation, but have fallen short of achieving it. Now, a team from the Lawrence Livermore National Laboratory (LLNL) in California may have made a major leap toward creating energy-giving ‘stars’ inside reactors here on Earth.

A team from LLNL has reportedly managed to achieve fusion ignition at the National Ignition Facility (NIF), according to a statement published Tuesday (Dec. 13). “On Dec. 5, a team at LLNL’s National Ignition Facility (NIF) conducted the first controlled fusion experiment in history to reach this milestone, also known as scientific energy breakeven, meaning it produced more energy from fusion than the laser energy used to drive it,” the statement reads.

The experiment involved bombarding a pencil-eraser-sized pellet of fuel with 192 lasers, causing the pellet to then release more energy than the lasers blasted it with. “LLNL’s experiment surpassed the fusion threshold by delivering 2.05 megajoules (MJ) of energy to the target, resulting in 3.15 MJ of fusion energy output, demonstrating for the first time a most fundamental science basis for inertial fusion energy (IFE),” LLNL’s statement reads. 
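Using only the figures in that statement, the target gain works out to roughly 3.15 MJ ÷ 2.05 MJ ≈ 1.5, meaning the fusion reactions released about one and a half times the laser energy actually delivered to the target. That accounting excludes the far larger amount of electricity needed to power the lasers themselves, which is one reason practical fusion power remains distant.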

Still, that doesn’t mean fusion power is within reach, LLNL cautions. “Many advanced science and technology developments are still needed to achieve simple, affordable IFE to power homes and businesses, and [the U.S. Department of Energy] is currently restarting a broad-based, coordinated IFE program in the United States. Combined with private-sector investment, there is a lot of momentum to drive rapid progress toward fusion commercialization,” the statement continues.

Even though this is only a preliminary step towards harnessing fusion power for clean energy, LLNL leaders are hailing the accomplishment as a transformative breakthrough. “Ignition is a first step, a truly monumental one that sets the stage for a transformational decade in high-energy density science and fusion research and I cannot wait to see where it takes us,” said LLNL Director Dr. Kim Budil during Tuesday’s press conference.

“The science and technology challenges on the path to fusion energy are daunting. But making the seemingly impossible possible is when we’re at our very best,” Budil added.

The conditions created in the experiment led to the ignition of the fusion reaction, which, however, was sustained for only a very short period of time. During that brief window, the energy generated by the fusing atoms surpassed the energy delivered by the lasers igniting the reaction, a milestone known as net energy gain.

Scientists at the laboratory have conducted several fusion experiments in recent years that did not generate enough power to claim a major breakthrough. In 2014, the team produced about as much energy as a 60-watt light bulb consumes in five minutes. Last year, they reached a peak power output of 10 quadrillion watts, but the reaction released only about 70% as much energy as the experiment consumed.

The fact that the latest experiment produced a little more energy than it consumed means that for a brief moment, the reaction must have been able to sustain itself, using its own energy to fuse further hydrogen atoms instead of relying on the heat from the lasers. 

However, the experiment produced only about 0.4 MJ of net energy gain, roughly as much as is needed to boil a kettle of water, according to the Guardian.

The breakthrough comes as the world grapples with a global energy crisis caused by Russia’s war against Ukraine while striving to find new ways to cover its energy needs sustainably without burning fossil fuels. Fusion energy is free not only from carbon emissions but also from the long-lived radioactive waste that is a dreaded byproduct of nuclear fission.

The New York Times, however, cautions that while promising, the experiment is only the very first step in a still long journey toward the practical use of nuclear fusion. Lasers efficient enough to launch and sustain nuclear fusion on an industrial scale have not yet been developed, nor has the technology needed to convert the energy released by the reaction into electricity.

The National Ignition Facility, which primarily conducts experiments that enable nuclear weapons testing without actual nuclear explosions, used a less common method of triggering the fusion reaction than most fusion projects.

Most attempts at igniting nuclear fusion involve special reactors known as tokamaks, which are ring-shaped devices holding hydrogen gas. The hydrogen gas inside the tokamak is heated until its electrons split from the atomic nuclei, producing plasma. 

NIF takes a different approach, known as inertial confinement: its 192 laser beams are fired at a small gold cylinder, called a hohlraum, that holds the fuel pellet. The lasers heated the cylinder to a temperature of about 5.4 million degrees Fahrenheit, vaporizing it and producing a burst of X-rays. These X-rays then heated the small pellet of frozen deuterium and tritium, two isotopes of hydrogen. As the core of the pellet heated up, the hydrogen atoms fused into helium in the first glimmer of nuclear fusion.
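For reference, the specific reaction at work in these experiments fuses a deuterium nucleus with a tritium nucleus to produce a helium-4 nucleus and a neutron: D + T → helium-4 + neutron, releasing about 17.6 MeV of energy per fusion event, most of it carried away by the neutron.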