In today's Finshots, we talk about how a power source that was once considered a vanity object came to dominate the energy mix.
One of the most radical things propelling the rise of renewables right now is the proliferation of solar cells in the energy ecosystem. Only a few decades ago, solar energy was an insignificant component of the energy mix. Today, however, it’s competing with fossil fuels on an almost even footing. This raises the question — What really changed for solar? And how did we get here?
Let’s go back to the 1950s, when the price of photovoltaic (solar) cells was extremely prohibitive. If you had to install a single solar panel — the kind you’d put on your roof today — it would cost you a whopping ~$600,000 after adjusting for inflation. That is a ludicrous sum of money to pay for any kind of energy, let alone solar. So it was a practically useless alternative.
But then, things started changing. At first, progress was modest. But solar cells found their use — in calculators, watches, and space vehicles. Soon, companies were making large strides in improving cell efficiency. Every square inch could now capture and supply more energy.
They were making wafers thinner, optimizing the structure of cells, reducing the use of expensive ingredients like silver paste, and learning on the go. Governments jumped on the bandwagon too. They invested in research and development initiatives that yielded better materials, improved manufacturing processes, and made solar cells and modules more cost-competitive.
Bottom line — Small incremental changes across the board helped drive down costs.
But there was something else happening in the background. Over the past few decades, governments began incentivizing the use of solar. They offered grants and subsidies. They bore part of the cost alongside you, and it changed the whole landscape. For starters, it made PV cells more affordable, and solar became a viable alternative for some people. But more importantly, as demand for solar cells started booming, it helped manufacturers leverage economies of scale.

Think of it this way — Manufacturing solar cells is an extremely tedious affair. The amount of money you’d have to shell out to set up the factories — It’s ridiculous. But once you get the ball rolling, the incremental cost of adding new capacity isn’t all that high. So the more cells you produce, the better your chances of churning out a profit. And as the profits keep rolling in, you can invest in expanding capacity some more. This in turn breeds a virtuous cycle that keeps benefiting customers.
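This virtuous cycle is often modeled with a learning curve (Wright's law): unit cost falls by a fixed fraction every time cumulative production doubles. Here's a minimal sketch of that idea, assuming an illustrative ~20% learning rate (a figure commonly cited for solar PV, not taken from this article):

```python
import math

def unit_cost(initial_cost, initial_units, cumulative_units, learning_rate=0.20):
    """Wright's law: unit cost falls by `learning_rate` with every
    doubling of cumulative production. Numbers are purely illustrative."""
    doublings = math.log2(cumulative_units / initial_units)
    return initial_cost * (1 - learning_rate) ** doublings

# A 10x rise in cumulative output is ~3.32 doublings, so cost drops
# to roughly 0.8 ** 3.32, i.e. about 48% of where it started.
print(round(unit_cost(100.0, 1.0, 10.0), 1))
```

The key property is that the cost decline depends on cumulative volume, not time. So anything that boosts demand (like subsidies) accelerates the cost drop, which boosts demand further.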
As companies battle it out in a free market, they get more ambitious. And as they learn how to set up massive factories, they also learn how to optimize the manufacturing process. They no longer produce as many faulty cells as they used to. They figure out how automation can reduce costs. They invest in R&D themselves and improve cell efficiency. All in all, they’re responsible for making solar energy more viable, alongside the government. And sure, all this happened because governments in developed countries (and China) decided to subsidize solar power. But you’ve got to give the private sector some props as well.
Finally, there’s one last thing — Fossil fuels. Remember, all costs are relative. If the price of solar energy drops, you’d better hope the price of non-renewable energy isn’t tanking alongside it. Because then the cost advantage would simply evaporate. Luckily for us, that didn’t happen. For instance, while solar energy became 90% cheaper, the price of electricity from coal declined by merely 2%. That’s because dirty fuel was already cheap to begin with. There wasn’t much scope left to eke out inefficiencies. More importantly, governments made it a policy objective to penalize non-renewable sources of energy. They deliberately made it more expensive to use these polluting variants, and as a result, the relative cost benefits fully accrued to the likes of solar.
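The 90% vs 2% figures make the relative-price point concrete. A toy calculation, assuming a hypothetical starting ratio (the "5x" is an assumption for illustration, not a figure from this article):

```python
# Hypothetical starting prices (assumed, not from the article): suppose
# solar once cost 5x as much per unit of electricity as coal.
solar_before, coal_before = 5.0, 1.0

solar_after = solar_before * (1 - 0.90)  # solar fell ~90%
coal_after = coal_before * (1 - 0.02)    # coal fell merely ~2%

# Solar goes from 5x coal's price to roughly half of it.
print(round(solar_after / coal_after, 2))
```

Whatever the starting ratio, a 90% drop against a 2% drop shrinks solar's relative price to about a tenth of what it was, which is why the gap flipped in solar's favour.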