The detail that has stuck in my mind since the last time I read about this is:
Thermodynamically speaking, if you transmit electricity from outside the Earth onto the Earth – even if you do it perfectly efficiently – you are, by definition, heating the planet.
Based on that I concluded that it is superior to generate electricity with inputs that are already hitting the Earth… But I’d be very interested to learn more about this.
The same thing happens when we create heat on Earth with nuclear energy: more than 50% of it goes directly into heating the Earth as waste heat, and the electrical part ends up as >99% waste heat too (some tiny fraction of the energy is probably permanently converted into chemical bonds, etc.)
However, this is completely dwarfed by the flux of energy in and out of the Earth. I forget the exact figure, but the solar input is on the order of 100,000 TW, and of course about that amount has to be rejected to space. The key dynamics that keep us in balance are of course the quantities of greenhouse gases in the atmosphere, albedo, etc.
And of course, coal and natural gas release stored (chemical) energy as heat too, but their contribution via changes to the atmosphere far exceeds their contribution from direct heating at the surface of the Earth.
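To put rough numbers on "completely dwarfed", here's a quick back-of-envelope (the figures are approximate and not from anything in this thread):

    # Rough comparison of heat flows at the Earth's surface (approximate values)
    SOLAR_INPUT_TW  = 173_000  # sunlight intercepted by Earth, on the order of 10^5 TW
    GEOTHERMAL_TW   = 47       # heat flux from the Earth's interior
    HUMAN_ENERGY_TW = 20       # total human primary energy use; nearly all ends up as heat

    print(f"human waste heat / solar input: {HUMAN_ENERGY_TW / SOLAR_INPUT_TW:.4%}")
    # -> roughly 0.01%: direct anthropogenic heating is tiny next to the solar flux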
What we get from the Sun in one hour is roughly what we use in a whole year.
All our energy needs could be covered by tapping roughly 0.01% of the energy received: on the order of a few hundred thousand km² of solar panels.
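A quick sanity check on the area figure, assuming typical insolation and panel efficiency (all numbers are round estimates):

    # Area of PV needed to cover average world energy use (rough estimate)
    world_power_W       = 20e12   # ~20 TW of average total consumption
    insolation_W_per_m2 = 200     # averaged over day/night/seasons at a decent site
    pv_efficiency       = 0.20    # typical mass-market panel

    area_km2 = world_power_W / (insolation_W_per_m2 * pv_efficiency) / 1e6
    print(f"~{area_km2:,.0f} km^2")  # ~500,000 km^2, very roughly the area of Spain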
If every residential and commercial building were covered with a solar roof, we’d have all our daytime energy needs covered.
It’s free energy if we know how to use it. Plants do, and they make up around 80% of the biomass on this planet.
The problem with CO2 is that it traps the heat. All useful work is done as high-energy photons from the Sun, in the ultraviolet and visible range, get dissipated into heat (infrared).
If we don’t solve the greenhouse gas problem, we can’t use more energy, since we’d be out of equilibrium.
Plus, solar panels are actually MORE reflective than forests, so if we cut down forests to place solar panels we'd be directly helping against climate change by raising the albedo of the surface.
(Forests are only a little less of a "heat island" than an asphalt surface is, and that makes sense, doesn't it? Forests actively try to maximize what they capture from the sun. Capturing more light is why trees exist in the first place.)
This is false: if your SBSP conversion efficiency on the ground is better than that of solar panels on the ground, then you will add less energy to Earth by collecting the power in space and transmitting it down than by covering reflective desert with a very non-reflective material like solar panels.
Hmm. If your PV has efficiency > the albedo of the ground it covers, it will be better than SBSP, even if the rectenna is 100% efficient (assuming the emissivity of the PV is at least as good as that of the desert in the far IR). This assumes the rectenna has the same optical properties as the ground it covers.
One could improve the PV by making it highly reflective in the near IR at photon energies below the bandgap of the cells, while still being highly emissive at the longer wavelengths where it radiates heat.
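A toy model of the condition being discussed here; the numbers are illustrative, and it assumes every joule absorbed by the planet (including the electricity) eventually ends up as heat:

    # Extra heat added to Earth per watt of electricity delivered
    def ground_pv_heat_per_watt(ground_albedo, pv_albedo, pv_efficiency):
        # Replacing ground with PV changes the absorbed fraction of sunlight by
        # (ground_albedo - pv_albedo); the electricity produced is pv_efficiency
        # of the incident light and also ends up as heat somewhere on Earth.
        return (ground_albedo - pv_albedo) / pv_efficiency

    def sbsp_heat_per_watt(rectenna_efficiency=1.0):
        # Every joule beamed down would not otherwise have hit Earth, so it is
        # all new heat (more if the rectenna is lossy).  The rectenna is assumed
        # to have the same optical properties as the ground it covers.
        return 1.0 / rectenna_efficiency

    print(ground_pv_heat_per_watt(0.35, 0.05, 0.20))  # bright desert: 1.5 > 1, SBSP heats less
    print(ground_pv_heat_per_watt(0.20, 0.05, 0.20))  # darker ground: 0.75 < 1, ground PV heats less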
You could put the solar panels between the Sun and Earth so they only capture photons that would have otherwise hit Earth, if that really matters. This would even be nice because if we wanted to cool the planet down we could just redirect some rays.
But I think using an appreciable amount of the power provided to Earth by the sun is still sci-fi stuff anyway, so we probably can’t make an appreciable dent either way.
Sagan put us at around Kardashev 0.7; it's a log scale shifted by a constant. Apparently we're using well under 0.2% of what hits the Earth, I guess.
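Roughly, using Sagan's interpolation formula (the power figure is an approximation):

    # Sagan's interpolated Kardashev scale: K = (log10(P[W]) - 6) / 10
    from math import log10

    human_power_W    = 2e13     # ~20 TW of primary energy use
    solar_on_earth_W = 1.73e17  # total sunlight intercepted by the Earth

    print(f"K ~ {(log10(human_power_W) - 6) / 10:.2f}")                      # ~0.73
    print(f"fraction of insolation: {human_power_W / solar_on_earth_W:.3%}") # ~0.01%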
If you set the thing up in the L1 Lagrange point it will have the same day/night problem that traditional solar has. You would need at least two, but probably realistically 3 different antenna farms to collect the power and the satellite would need to be able to track the beam across the surface of the Earth. Most of the proposals I've seen have the array in geosynchronous orbit so it can easily remain pointed at a fixed spot on the Earth.
You could also beam power from L1 to geosynchronous orbit and then to a single ground station. It really depends on how efficiently you can beam power, but reaching Type 1 on the Kardashev scale runs into limits based on radiative cooling into space versus incoming energy.
However, it makes sense to have hundreds of ground stations simply to minimize transmission losses on the ground. And presumably utilizing ground stations 24/7 is vastly less critical than maximizing the return on space based infrastructure.
One could put SBSP at the Earth-Sun L2 point, and just use ground based solar during the day. Weirdly, this would mean in winter at high latitudes one would mostly be getting solar power at night!
At the distance of the L2 point laser power beaming would probably be necessary to keep the transmitter and receiver sufficiently small.
There's no exponential growth, that's thermodynamically impossible. Heating the Earth with the microwaves would increase the Earth's radiation into space commensurately. It doesn't get stored forever. This would result in a new equilibrium almost immediately, with an unmeasurably greater temperature.
The Earth is always almost exactly in radiative balance with space, except on pretty short timescales. If it weren't, we'd quickly cook. The radiation the Earth receives from the Sun fluctuates over solar cycles by an amount an order of magnitude larger than our energy use, and it's debated whether even that has a meaningful effect on global temperature.
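As a rough check on "unmeasurably greater", here's the blackbody estimate (it ignores feedbacks, and the numbers are approximate):

    # Shift in equilibrium temperature from a constant extra power input
    SIGMA      = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
    T_EFF      = 255      # Earth's effective radiating temperature, K
    EARTH_AREA = 5.1e14   # m^2

    extra_power_W = 20e12                  # say, beaming down ~20 TW, all ending up as heat
    forcing = extra_power_W / EARTH_AREA   # ~0.04 W/m^2
    dT = forcing / (4 * SIGMA * T_EFF**3)  # linearizing P ~ sigma*T^4
    print(f"{dT:.3f} K")                   # ~0.01 K, lost in the noise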
(I'm not referring to exponential growth in temperature over time due to constant input power, but rather to exponential growth in input power over time because it's required for economic growth).
> exponential growth in input power over time because it's required for economic growth
It isn't. The amount of energy (and raw materials generally) required to produce a given amount of economic output is not constant. It gets smaller as technology advances. That offsets the effect of increased economic output. Indeed, as more and more economic output becomes information instead of physical objects, the average amount of energy required per unit of economic output will shrink even more.
Not only is this not guaranteed, it is not possible to be sustained in the long term, either physically or economically. Physically, there are bounds on the energy inputs required for any process, including information processing. Economically, if economic output increases exponentially while energy input is constant, this creates a contradiction, as it becomes exponentially easier over time for one individual to monopolize the entire energy supply.
The notion that economic growth can continue without growth in energy use is a short-term illusion. It is created by the transition to an information economy, by the outsourcing of manufacturing, and perhaps by a lack of appreciation for the ongoing growth in energy consumption even within countries like the United States as the economy shifted away from physical goods, to say nothing of the growth in energy consumption in countries like China that ramped up physical manufacturing to make this possible.
Per capita energy consumption has not been growing in developed countries like the US. It has been decreasing for at least a couple of decades. Total energy consumption has been increasing because of population increase, but that is expected to level off around the middle of this century.
> it becomes exponentially easier over time for one individual to monopolize the entire energy supply
No, it doesn't, because everyone else is also increasing their economic output. Assuming stable population, one individual's share of the energy supply remains constant.
And yet, the industrial manufacturing sector uses more! As does the computing sector.
This is because the reduction in energy per unit of output due to efficiency improvements goes -- to be very generous -- as 1/t, while output grows as e^t, and e^t/t is still exponential for large t.
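A quick numeric illustration of that point (the growth rate and efficiency curve are made up, just to show the shape):

    # Exponential demand vs. (generously) linear-in-time efficiency gains
    from math import exp

    for t in (10, 20, 40, 80, 160):           # years
        output = exp(0.03 * t)                # 3%/yr growth in economic output
        energy_per_output = 1 / (1 + t / 20)  # energy intensity falling over time
        print(t, output * energy_per_output)  # net energy use
    # The product dips at first, then climbs without bound: efficiency delays
    # the exponential, it doesn't cancel it.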
That's a graph of CO2 concentration in the atmosphere, not human emissions. It won't look exponential until the magnitude of human emissions exceeds the magnitude of the natural carbon cycle by several times. Give it a decade or three.
Some criticisms of our economic system take the premise that growth will continue unabated without any limits and follow it through: after a couple hundred years, that exponential reaches the point where all the waste heat can no longer be dissipated from the Earth, and it all breaks down.
If you’re postulating unbounded growth, seems pretty arbitrary to not postulate engineering solutions to this problem (eg shift the energy intensive industry off-Earth).
Not all of that energy will end up as heat. And the surface area it might cover won't be significant compared with the Earth's. And if it is, it may shade enough of the surface to actually cool the Earth down.
In any case, it won't be worse than releasing, over the course of about a century, a good amount of the solar energy that was captured in chemical form over millions of years.
Typical power stations (whether fueled by renewables or not) have efficiencies considerably less than 50% (the examples in this article give 40% efficiency for coal, 48% for nuclear, and 33% for geothermal).
Photovoltaics (not being Carnot-cycle heat engines) aren't subject to these specific limitations, but they have their own problems. Typical efficiencies for current mass-production models are around 20%, while "lab curiosity"-level cells still haven't broken 50%.
Note that this is just the generating side. There are also similar waste heat losses on the consumption side (for example, charging and discharging batteries is anything but 100% efficient, as anyone who's actually tried to use a modern laptop on his lap can attest).
The difference is, that 80% doesn't show up on anyone's "books." For coal, you actually need to go dig that 60% out of the ground and burn it, and it still emits a bunch of CO2.
For a fuller articulation of the point HN user thelastgallon is making, see this link:
> The difference is, that 80% doesn't show up on anyone's "books."
Well, no. You still have to pay the amortized cost of the installation and real estate, the salaries of the employees, and many other things, all of which would be less if the efficiency were higher than 20%. For example, if the photovoltaics were 40% efficient (the same as coal), you'd need only half the real estate, half the semiconductor-grade silicon, and likely half the cost of many maintenance activities, none of which are free.
I didn't say they were free, I said that the energy doesn't show up on anyone's (energy) books. You know which books I mean: they're the ones where companies & countries report their energy reserves, the amount mined/burned/imported/exported, new discoveries this year, etc etc.
The reason this matters is because there's a lazy temptation to run the Electrification Calculation merely by looking up the amount of primary fossil energy burned annually, then assuming 100% of this must be replaced by solar/wind/whatever. However this simplistic calculation will over-estimate the amount of renewable energy needed by a factor of roughly 3x.
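A sketch of where that ~3x comes from, with illustrative numbers (the exact factor depends on which conversion losses you count):

    # Primary fossil energy vs. the useful energy electricity actually has to replace
    primary_fossil_units = 100
    avg_conversion_efficiency = 0.35  # power plants, engines, furnaces taken together
    useful_energy_units = primary_fossil_units * avg_conversion_efficiency  # ~35

    # Electrified replacements (motors, heat pumps, ...) deliver useful energy directly,
    # so the build-out needs ~35 units, not 100:
    print(primary_fossil_units / useful_energy_units)  # ~2.9x overestimate otherwise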
> This site doesn't even mention solar or the efficiency thereof.
I fear you only want to 'win the debate' (vs reading for comprehension), but...
The entire thesis of the article is how mass renewable electrification enjoys large system efficiency gains over fossil fuels. It's pretty evident how solar is a critical enabler of mass renewable electrification.
My reading comprehension is just fine. You were arguing that coal had to be "dug out of the ground", unlike photovoltaic cells which are apparently dropped off on your doorstep for free by the Silicon Fairy or something.
There's the reading comprehension. Again, nobody is saying solar panels are free, and I'm not sure where you got the idea.
I'm saying that when states and corporations do their energy reporting, there's no need to report the non-absorbed ('waste') energy from PV. Sunlight striking the ground (and whether it's utilized in a way we appreciate, versus 'just' powering the weather and the water cycle) is not something we include in those numbers.
Heck, maybe we should make a home for your PV 'waste' energy, a new energy statistic that does account for all sunlight striking the Earth. So if you cut down vegetation to make a parking lot, it makes your country's energy numbers get worse. Neat! Maybe that would be useful as an additional metric, but it's far from what we're trying to measure with our existing energy reporting policies. Our existing policies emphasize the (much larger) problems of greenhouse gas emissions and local pollution impacts.
Anyway I think the point has been adequately made, cheers.
I think the more important facet is the output stage. Almost all energy consumed gets converted to heat, with only a small portion doing usable work. Unless there's a massive improvement in the electrical foundations of compute, we will be producing large amounts of heat no matter where the energy is sourced from.
Unless... the compute also happens in space? Given how dirt cheap solar has become, how cheap shipping stuff to space is becoming, and how few clouds and nights there are in space to make solar power production intermittent, it sounds like it might be economically feasible in the not-so-distant future. (No, I haven't done any math on this. If it checks out, feel free to steal the idea.)
The issue with large components (we're talking microns instead of 20 nm) is launch weight (coming down) and power (also coming down). Large components also mean larger silicon dies, which are much more expensive, and/or fewer components per die, which means the CPUs end up on different chips and need interconnects, which increases latency and interference. Not impossible, just a load of min-maxing to do.
You would make the stuff in space, too. Give it a gentle shove off the factory loading dock (factory is on an asteroid) and a couple of years later it shows up in earth orbit, if you get your orbital calculations right…
Not literally a gentle push, but very little rocket action is needed. The gravity well of an asteroid is tiny. The rest can be done with the correct slingshot maneuvers, the problem is calculating it. I am sure I have read something or other from NASA about it.
It's not the asteroid's gravity that's the issue, it's the solar gravity field: you still have to perform an orbital transfer from the asteroid's orbit to Earth orbit, unless you want to leave the computer there and do batch jobs with significant latency.
And the Earth is in space, so if we get to the power consumption level where Earth governments need to care about the direct planetary heating effect of the energy source, it's still a win to do the hard thing (dissipation) somewhere else, like the Moon or something.
You also affect how much energy (heat) is radiated from the Earth's surface into space by choosing the right colors and materials for the buildings and land around you.
We're not talking about just a one-time pulse of heat; it would be a constant stream of energy. To maintain the same radiative balance, the Earth's surface would need to reach a higher equilibrium temperature.
We currently have an energy budget that's roughly 0.01% of insolation (and compounding growth at 2-5% per year), so if SBSP actually scales to its market opportunity then the effect could certainly become large enough to matter.
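A toy projection of how long that compounding would take to matter, assuming a constant growth rate and that all of the added power ends up as heat (both assumptions, not predictions):

    # Years until direct heating becomes a given fraction of insolation
    from math import log

    current_fraction = 1e-4   # ~0.01% of insolation today
    growth_rate      = 0.03   # 3%/yr, middle of the 2-5% range above

    for target in (0.001, 0.01, 0.1, 1.0):
        years = log(target / current_fraction) / log(1 + growth_rate)
        print(f"{target:.1%} of insolation in ~{years:.0f} years")
    # roughly 80, 160, 230, 310 years at 3%/yr: centuries out, but not never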