SWM wrote: Cralkhi,
I see where you are coming from. You believe that progress is not inevitable, that it is possible to reach a semistable state where technology does not advance beyond a certain point.
Essentially, yes, though I'd describe it as a state in which 'falls' and 'rises' more or less balance out -- not "stable" on the scale of human lifetimes.
And I think the possibility of us getting stuck in that state is still "live", unfortunately. If we screw up and wreck technological civilization too thoroughly (major nuclear war, biological warfare, etc.), we might not get it back, especially if the collapse ended up making people afraid of technology, so that by the time people started trying to re-invent stuff, the records had all rotted away.
Safehold might never have reached the stars. But that still does not mean that Langhorne was right.
Certainly not morally!
The Gbaba have apparently been around for many tens of thousands of years, at a minimum, so there is plenty of time for change to come to Safehold.
I don't think we actually know that the Gbaba starfaring "civilization" is that old. They've been stable with no advancements for at least 2000 years, though.
(And given that we know at least three technological species -- humanity, the Gbaba, and the one the Gbaba wiped out before humanity -- originated in a very small area of the galaxy, intelligent species are likely common in the Safeholdverse... and if the Gbaba can do nothing but attack everyone they meet, sooner or later they will meet somebody more powerful and be destroyed.)
So there is a pretty good chance that their expected lifespan as a threat is not very long on the timescale we're discussing.
(If it wasn't for an authorial comment that we will see the Gbaba again, I'd be wondering if Safehold would get back to space and instead encounter somebody worse who had already wiped out the Gbaba.)
--
It does seem odd to me that Merlin, Shan-wei, et al. thought/think that Safehold would be essentially doomed if it reinvented space travel without knowing about the Gbaba. Safehold is much farther from Gbaba space than Earth is, so a Safeholdian-origin interstellar civilization would have far more room to expand before first contact, and would likely be much bigger, older, and higher-tech when it met the Gbaba than the Terran Federation was.
Obviously it's not a risk you want to take, but it does seem odd.
Langhorne took it upon himself to decide the fate of humanity, throwing the dice on a bet that the best minds of the Federation disagreed with. He didn't have any more information than the Federation scientists. He didn't have any special insight. He just disagreed with them. (I will leave out any claims of paranoia, phobia, or megalomania in this analysis.)
Now, I do believe that sometimes you have to make a stand for your beliefs, even when everyone else thinks you are wrong. That is a noble thing. But the actions that Langhorne took in the name of his beliefs were far from noble. He modified the memories of almost the entire surviving human race without consent. He didn't try to change their minds -- he ripped the truth from their minds and forced them to think the way he wanted them to think. He stole from the colonists the high ideal they thought they were sacrificing themselves for, trying to seal humanity into a bottle. And when other people in Alexandria disagreed with him and made a stand for their beliefs, he struck at them with hidden weapons and no warning. Struck viciously, not merely killing them, but destroying the very continent they lived on, pounding it over and over again.
Standing up for your beliefs is noble. Forcing an entire population into a form of mental slavery to align with your beliefs is just wrong, no matter how noble your aim (and I don't really believe his motivations were that noble). And when doing so brings a very strong possibility of dooming the entire human race to destruction (in one form or another), the magnitude of the wrong is incalculable.
Oh, I most certainly agree that Langhorne was completely morally wrong.