Given that one of the largest problems with the data centers we’re building today is heat dissipation, that seems like an exceptionally poor choice. Space creates major problems for heat dissipation.
Yeah. Honestly, I’m having a hard time thinking of any substantial benefits. Eventually, okay, sure, there’s a point in time where we can’t create computer structures on Earth if we’re going to scale up, but that is way the hell out there on the list of constraints we have. I also kind of suspect that materials science and manufacturing and computing technologies may change a lot and obsolete anything we create now long before that.
The article has:
“Starcloud’s mission is to move cloud computing closer to where data is generated,” Starcloud CEO Philip Johnston said in a statement.
But most data isn’t generated in space. It’s generated on Earth. Maybe if you have some kind of Earth-observation satellite in low Earth orbit and want to add a shit-ton more processing capability to it so you don’t have to send its data back down to data centers on Earth to chew on? Sounds kind of Orwellian, but maybe I could see that. But it seems like such a niche case.
You could put that on the dark side of the moon, for example. Doesn’t get much colder than that. Radiating the heat away is no problem if you don’t soak up much in the first place. The question is where the energy comes from. I’m guessing they are going for nuclear energy for that. Sounds doable, but why would you want to?!
The moon is tidally locked to the Earth, not the Sun. I.e., “the dark side” is the side that is never exposed to Earth, but it still has a regular day/night cycle over the lunar month. And dissipation would still be a problem, because you don’t have air to dump the heat that computers generate into.
Radiative cooling. You can still have a data center in a stationary orbit between the earth and the moon thereby shielding it from the sun most of the time. But you are right, much harder problem. Gotta think on that some.
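For a sense of scale on the radiative-cooling point above, here is a rough back-of-envelope sketch using the Stefan-Boltzmann law. All the specific figures (1 MW of heat, 300 K radiator temperature, 0.9 emissivity) are illustrative assumptions, and it deliberately ignores absorbed sunlight, Earthshine, and radiator mass:

```python
# Back-of-envelope: radiator area needed to reject data-center heat purely
# by thermal radiation into deep space (Stefan-Boltzmann law).
# Figures below are illustrative assumptions, not a real spacecraft design.

SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9    # assumed emissivity of a good radiator coating

def radiator_area_m2(heat_watts: float, radiator_temp_k: float) -> float:
    """Radiator area (m^2) needed to radiate heat_watts at radiator_temp_k."""
    return heat_watts / (EMISSIVITY * SIGMA * radiator_temp_k ** 4)

# A modest 1 MW facility with radiators held at 300 K (~27 C):
area = radiator_area_m2(1e6, 300.0)
print(f"{area:.0f} m^2 of radiator needed")  # ~2400 m^2
```

Since radiated power scales as T^4, running the radiators hotter shrinks the area fast, but that also means keeping the electronics hotter. Either way, megawatt-class compute needs radiator panels measured in thousands of square meters, which is the crux of the heat-dissipation objection.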
The high radiation environment and the challenge of doing common maintenance tasks (e.g. disk replacement) seem prohibitively difficult as well…