
Grid Electricity Going Up... Because of Data Centers?!

I just told you they are using combined cycle.
So you think they are 100% thermally efficient?

Even modern 2x1 (2 gas turbines, one steam turbine to capture that waste heat) combined cycle plants generate tons of waste heat.

There is an 800 MW unit in my area that lost its make-up water feed from the municipality; they only had 8 hours of pondage before they were coming offline. Granted, it was high 90s out, but still.

Also, isn't the proposed Meta facility in Louisiana about the size of Manhattan, at 2,250 acres?

Show me any area that could use that waste heat that has that much free space.

Maybe some industrial manufacturing that could be built next to it?
 
A lot depends on what you are willing to do. Their business is providing "data".

Every time you extract work from heat, the harder it gets to extract more energy from whatever is left, because it is at a lower temperature. It is pretty easy to extract work from the heat of a gas turbine engine, because the exhaust is still extremely hot. Extracting work from the heat leaving a steam turbine is harder, because the exhaust temperature is much lower, but you can still get plenty of work out of it.

You can run that heat through a low temperature steam turbine to capture more energy, or use it to run Stirling generators to capture a greater percentage of the heat. The heat out of THOSE can also still be used in something like district heating, or absorption chilling.

Depending on what you want to do to try to capture that heat energy, you could, in theory, come fairly close to getting to 100% efficiency. You'll never get there, but you could get close.
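That diminishing-returns effect is just the Carnot limit at work. Here's a quick sketch; the temperatures are illustrative assumptions on my part, not real plant data:

```python
def carnot_limit(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical maximum fraction of heat convertible to work (Carnot)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

# Gas turbine exhaust (~600 C) feeding a steam cycle rejecting at ~40 C
print(f"steam bottoming stage, theoretical cap: {carnot_limit(600, 40):.0%}")

# Low-grade steam exhaust (~100 C) feeding a Stirling/low-temp stage
print(f"low-temperature stage, theoretical cap: {carnot_limit(100, 40):.0%}")
```

Under these assumed temperatures, the hot stage could theoretically convert about 64% of its heat to work, while the low-grade stage tops out around 16% — which is why each successive pass yields less.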

PROBABLY you can't get economically close to that. Are you going to locate a city near it and run steam pipes everywhere? I went to PSU; they had an on-site coal power plant that also used the waste heat for steam heating the campus in the wintertime. I'd imagine they were probably ballpark 70% or so efficient on input energy in the wintertime (the way the steam lines and such were run, there was a lot of heat loss in the system before actually heating buildings).

What kind of industrial processes are going to use that heat?

But if up-front cost were no object, I'd imagine you could install a combined cycle natural gas plant and then use the steam cycle waste heat for absorption chilling and probably get the total efficiency into the 70% range for utilizing input energy. Maybe even high 70% range. Long term, I'd bet that is probably also a cost savings, but Google, Tesla, MS, etc. mostly care more about upfront costs. How fast can they scale "AI" to beat their competitors. The long-term cost of operation is a footnote, especially if it delays construction and turning it on by weeks, months, or years.

Again, back to Grok and xAI's use of natural gas turbines that were NOT permitted, lying on permit applications, etc. They couldn't care less about breaking the law if it gets them up and running sooner. Waiting months or a couple of years for grid ties, permits, etc. is not something they care about. Just like they couldn't care less if it costs them tens or hundreds of millions of dollars more a year to operate. That is a worry for someone else in a couple of years; until then, just burn more money and hope we "win".

Heck, if they cared the most about long-term costs, they'd be pushing solar facilities with batteries nearby. That is still typically cheaper than building new power plants and fueling them, even without any subsidies now. But that'd also cost TIME: possibly a few years to permit and run eminent domain for transmission lines, and to buy/lease that much land for a GW-scale solar facility.

Which is also why these facilities generally aren't doing any solar on site at all. They'd likely recover the input costs within a few years on the electrical generation, but it would delay getting them up and running by at least months. In slight fairness, I get that. If you told me it'll take 18 months to construct a facility that was going to generate a billion dollars a year in revenue and cost $175 million in electricity to run, but adding solar and batteries to the project would save me a million dollars a year and delay the project by a month, I'd probably laugh at you and ask if maybe once it was up and running we could consider putting it in if it didn't disrupt anything. That one-month delay would cost over $80 million in revenue, more than 80 years' worth of those savings! Not including maybe $10 million in capital costs to install it, too. Even if the capital costs would pay for themselves in a few years.
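Running the numbers on that hypothetical (all figures are the ones from the paragraph above, not real project data):

```python
annual_revenue = 1_000_000_000  # $1B/yr once the facility is running
solar_savings = 1_000_000       # $1M/yr saved by adding solar + batteries
delay_months = 1

# Revenue lost to a one-month construction delay
delay_cost = annual_revenue * delay_months / 12
print(f"revenue lost to delay: ${delay_cost:,.0f}")

# How many years of solar savings that single month wipes out
print(f"equivalent years of savings: {delay_cost / solar_savings:.0f}")
```

One month of delay burns roughly $83 million, which is about 83 years of the hypothetical solar savings — the whole argument in one division.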

Same with things like cogeneration. Heck, that probably would save a ton more, maybe $20, $30M a month or more! But it probably would also delay construction by a few months, costing hundreds of millions of dollars in revenue. Likely also add tens of millions to the construction costs. And worse, delay "winning" by months. I might have lost because of that delay! Egads!

PS: Despite how negative I am being about it from a business side, this is why Gov't exists and should bring down the hammer and say "not today". This is all socializing costs onto everyone else while privatizing the profits. It is drastically driving up costs for everyone else by slurping down power. If Gov't were stepping in and mandating best practices to reduce facility energy use, along with slowing things down and mandating that utilities increase generation first, this wouldn't be happening. But it would "slow" "AI" by months and months.

Oh no. In other news.
 
I didn't say combined cycle is 100% efficient. It uses all the excess heat that is practicable to capture. That 800 MW unit you speak of isn't any different from any other evaporative cooling process: no water to evaporate, no cooling effect. That holds true for every conventional plant in the country. How many horsepower would you figure would be necessary to re-condense the steam from even a 30 MW steam system? You know they re-use the same pure, clean water for steam over and over, right?
One of the last powerhouses I helped construct was a coal-fired affair, and they sold its waste-heat steam to a paper mill about half a mile away. What a stink that was to drive past every day. Any usable waste heat from new capacity that Entergy or any other utility adds will very likely still be capitalized on, to the extent it is cost effective. That's what they do.
 
Tangentially related, this is a _very_ cool "Interactive Map of US Power Plants"
(Has all kinds of generation & storage details)

related, but different views:
 
To power the entire datacenter, I would put batteries in shipping containers that are away from the building.
how much do you want to spend on massive power cables and related equipment? Are you not going to climate control the batteries/inverter?
 

You're right, containers is a better way to go. Modular, smaller fire, the rest survives:
A 40 ft container has roughly 2350 cubic feet of space. Roughly 50% of it could be used for cells, heating/cooling, connections, etc. Presuming 1 cubic foot could hold 4 of the MB56 cells, this would equate to 4800 cells at 2 kWh each or 9600 kWh in a fully packed container. 100 containers would provide 960,000 kWh of storage capacity. Do your own math to figure out how many containers of cells would be required in the Louisiana data center site.
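Here's that container arithmetic as something you can tweak. Note the exact math gives 4,700 cells (the post rounds to 4,800), and the 1 GW site load at the end is purely my own assumption for illustration:

```python
container_ft3 = 2350     # usable volume of a 40 ft container
packing_fraction = 0.5   # half the volume lost to HVAC, racking, wiring
cells_per_ft3 = 4        # assumed density for MB56-class cells
kwh_per_cell = 2

cells = container_ft3 * packing_fraction * cells_per_ft3
kwh_per_container = cells * kwh_per_cell
print(f"{cells:.0f} cells -> {kwh_per_container:,.0f} kWh per container")

# Hours of autonomy 100 containers would buy a hypothetical 1 GW site
containers = 100
site_load_kw = 1_000_000
print(f"{containers * kwh_per_container / site_load_kw:.2f} hours at 1 GW")
```

Under these assumptions, 100 containers cover a 1 GW load for less than an hour — which puts the "do your own math" challenge in perspective.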
 
a setup like that would really be the bomb
 
Some insight on AI data center energy appetite.


We conclude that training runs in 2030 supported by a local power supply could likely involve 1 to 5 GW and reach 1e28 to 3e29 FLOP by 2030. Meanwhile, geographically distributed training runs could amass a supply of 2 to 45 GW and achieve 4 to 20 Pbps connections between data center pairs, allowing for training runs of 2e28 to 2e30 FLOP. All in all, it seems likely that training runs between 2e28 to 2e30 FLOP will be possible by 2030.

They will need to build their own combined-cycle natural gas (CCNG) power plants right next to gas drilling fields.
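As a sanity check on the quoted ranges: power × time × hardware efficiency gives total FLOP. The efficiency and run length below are my own rough assumptions, not numbers from the report:

```python
power_gw = 5           # upper end of the quoted local-supply estimate
flop_per_joule = 1e12  # assumed accelerator efficiency (~H100-class, low precision)
run_days = 180         # assumed length of one training run

# Total compute = watts * (FLOP per joule) * seconds
flop = power_gw * 1e9 * flop_per_joule * run_days * 86_400
print(f"{flop:.1e} FLOP")
```

With those assumptions the result lands at roughly 8e28 FLOP, inside the quoted 1e28 to 3e29 range for local-supply runs.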
 
Solar itself can work well WITH data centers, because computers tend to heat their rooms the most during the daytime, which is when the sun is strongest.

I was actually considering spending excess solar energy that way, but it's probably best done by people whose job is computing 100% of the time.
 
you can easily make something start mining crypto or something like that whenever you have surplus
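That really does reduce to a small control loop. Everything below is a sketch: the meter readings are fake, the thresholds are made up, and the hysteresis band just keeps the miner from flapping on and off around a single threshold:

```python
SURPLUS_ON_W = 500    # start the miner above this much exported power
SURPLUS_OFF_W = 100   # stop it below this (hysteresis)

def should_mine(exported_w: float, mining: bool) -> bool:
    """Decide whether the miner should run after this meter reading."""
    if not mining and exported_w > SURPLUS_ON_W:
        return True
    if mining and exported_w < SURPLUS_OFF_W:
        return False
    return mining

# Walk some fake readings through the controller
state = False
for watts in [50, 800, 600, 300, 80, 40]:
    state = should_mine(watts, state)
    print(f"{watts:>4} W -> {'mining' if state else 'idle'}")
```

In a real setup, the loop would poll the inverter or meter every minute or so and start/stop the mining process accordingly.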
 
Crypto mining has, on average, a return on investment of 2 years or more, so it's still best done by people who dedicate themselves to it; they'd have to find tricks to be more efficient than others.
 
Very cool interactive map! I can hover over every powerhouse I ever worked on and get the stats. All except S.O.N.G.S. (San Onofre Nuclear Generating Station). They seem to have wiped it from history.
 
it's easy to make money off of mining if you're using "free" electricity, not that you'll make very much...
 
I factored in the cost of the equipment. The ROI is often years out, and it's always a risk, especially if you start during a market peak before the prices tank for years...
 
I'd focus on crypto that is relatively ASIC resistant so you can make decent returns/watt with computers you already have
 
Yeah, that's smarter. Your job may need some PCs anyway, for example, and you can use those. It's still a risk, though, because you may damage your PC unnecessarily before it returns much...
 
Damaging a modern PC from mining is not a huge risk unless you're doing it 25/8 with insufficient cooling. Any CPU made in the last decade or so will generally (recent Intel issues aside) prevent itself from being melted. Insufficient cooling may make it run slower and wear it ever so slightly more, but it's really not a huge concern. I think a lot of that reputation comes from miners modifying GPU firmware and doing nutty overclocks that push them a bit past their limits.
 
What I'd love to figure out how to do is using excess solar (if I ever get there) on other sinks.

Such as: if the batteries reach full capacity, turn on the heat pump water heater until the temperature hits 140°F, and use a thermostatic valve to mix down to the proper temp. If no excess solar, max temp is 118°F.

Or, turn on heat or A/C with a set point 2-3F higher/lower than normal if excess solar. Just something that could be automated and not have to rely on me, say, running a load of laundry, or whatever.
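That logic is simple enough to sketch. The setpoints (140°F boosted, 118°F normal, a 3°F HVAC shift) come from the idea above; the battery-state reading and the function names are hypothetical stand-ins for whatever your inverter actually exposes:

```python
WATER_BOOST_F = 140   # tank max when batteries are full and solar is spare
WATER_NORMAL_F = 118  # everyday max, mixed down by the thermostatic valve
HVAC_SHIFT_F = 3      # widen the heat/cool setpoint by this much on surplus

def water_setpoint(battery_soc: float) -> int:
    """Boost the heat pump water heater only when the batteries are full."""
    return WATER_BOOST_F if battery_soc >= 1.0 else WATER_NORMAL_F

def hvac_setpoint(base_f: int, surplus: bool, cooling: bool) -> int:
    """Pre-cool or pre-heat a few degrees past normal when surplus exists."""
    if not surplus:
        return base_f
    return base_f - HVAC_SHIFT_F if cooling else base_f + HVAC_SHIFT_F

print(water_setpoint(1.0))                    # full battery -> boost the tank
print(water_setpoint(0.8))                    # still charging -> normal max
print(hvac_setpoint(75, True, cooling=True))  # surplus in summer -> pre-cool
```

An automation box polling the inverter could apply these setpoints on a schedule, so the diversion happens without anyone having to remember to run a load of laundry.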

I am likely to never have grid-tied solar, only hybrid. Grid tie requires utility approval where I am, and also permits and inspections.

If I keep it to 30 amps or less at any step, no permits are necessary...

I mean, if it is a couple of 25-30 amp inverters tied together.
 
