I'm a bit disappointed by this week's Isaac Arthur video about hive minds not going in depth on this question so I'm asking here.
Is there a compelling hard-science reason why a hive mind composed entirely of humans would lose the "fundamentally human qualities" of its members such as (but not limited to) empathy for other humans and a desire for entertainment?
Sure, the idea is that enhanced intelligence leads to more "rational" behavior (by the standards of the author who is only human IRL), but isn't it also possible that everyone would just be "irrational" together?

I guess the idea goes that a bunch of people who join into a hivemind are more likely to value the collective over the individual, maybe to the point where individual human life is no longer important to them.

This in part. Taken even further, in reality a true hive mind would have no individuals at all. Anything one experiences they all experience. More, there can be no secrets or personal thoughts. Everything about a person is instantly known, weighed and judged, and likely purged if the mass consciousness decides it's of no value to the hive. A true hive mind is one mind, period.

As to empathy... yes and no? It wouldn't feel any towards individuals of the hive. That would be like having empathy for your toenail clippings. If something were wrong with an individual and it couldn't be easily fixed, then the most rational way to deal with the problem would be to excise the individual. Corpses cause no harm. (This is assuming a large hive, of course. For smaller ones you would likely try to repair the damage, be it physical or psychological, since you can't afford to waste any parts of the whole. Empathy is still the wrong term, though. You don't empathize with your broken arm, you just go to a doctor to get it set.)
Any empathy the hive might feel would instead be for people and creatures outside of itself, like the way you or I can empathize with our dog or sibling.

As for entertainment, why not? It wouldn't be creating TV shows or the like for itself since what would be the point? It would already know everything about any given episode before it aired. Music, painting, sculpture and the like, though- things that one can take pleasure in the act of doing and not just observing- those would still be viable. And, of course, outside sources of entertainment would still likely be welcome unless the merging of the minds has so homogenized the individuals that they become emotionally null.

I like how it was phrased in the Ender's Game series. With a true hive-mind, you end up with the Formics, who have a single collective mind which guides all individuals. You stop having a bunch of people dynamically interacting and bouncing off of each other and start having a single individual with a bunch of separate bodies. This was one of the reasons initial contact with the Formics went so badly; they assumed humans were the same, and that killing a few humans was akin to a poke "hello".

Now, such an entity can certainly be irrational, and still desire things like entertainment. However, such things would be carried out on the scale of the hive-mind, instead of individuals. A simple example would be having one body watch TV while the rest work, as the workers still experience the TV.

A less extreme setup would be something like the Protoss Khala, which is sort of an opt-in hivemind. They can passively read minds and emotions of one another, but this really just aids in standard communication and increases empathy. The hive-mindy bit is where they can meditate to connect to the Khala and sense all others similarly connected, sharing information and such. This ends up working kind of like a psychic internet. Good for cohesion, but not so extreme as to cause everyone to devolve into a singular being.

Note that this latter example was still extreme enough to cause a rebel splinter-group to disfigure themselves so that they couldn't access or be accessed by the Khala, because they were so scared of it.

And I can't say I blame them. The thought of having my mental privacy taken away from me would drive me to extreme acts to stop it, too.

Anything not exactly like us is THEM and therefore bad and inhuman.

We don't really know what fundamentally human qualities actually are, and to the extent that we do, we're kinda handwavey on why we have them (something something selfish gene, something something mirror neurons).

So the idea that we know what a pile of brains wired together to work as one would be like is kind of farcical.

Also, anyone who thinks more intelligent = more rational should read a biography of, oh, any significant mathematician.

Georg Cantor proved there are multiple kinds of infinity. It's hard to think even vaguely about infinity; this guy did math with it. And Kurt Gödel, who revolutionized mathematical logic, starved himself to death because he only trusted his wife not to poison his food, and then she got sick.

Collective minds fascinate me, and it's fun to play what if, but the idea that you can say "they'll be like X and not like Y for reason Z" is gibbering lunacy.

>would lose the "fundamentally human qualities" of its members such as (but not limited to) empathy for other humans and a desire for entertainment?
They would actually most likely be expanded. When you can literally feel other beings' feelings, it's a bit harder to be a raging sociopath.

>A less extreme setup would be something like the Protoss Khala, which is sort of an opt-in hivemind. They can passively read minds and emotions of one another, but this really just aids in standard communication and increases empathy. The hive-mindy bit is where they can meditate to connect to the Khala and sense all others similarly connected, sharing information and such. This ends up working kind of like a psychic internet. Good for cohesion, but not so extreme as to cause everyone to devolve into a singular being.

The Conjoiners in Revelation Space have a similar thing using cybernetics. They're also much less aggressive compared to the traditional sci-fi hivemind.

More than Human by Theodore Sturgeon addresses this in detail.

Isn't a human hivemind just one person residing in multiple minds? It's not The Hive, it's just Bob. Bob isn't any smarter or faster than any other human, in fact depending on how he gets information between member bodies he may be slower. He'll remember more though.

Wasn't that less a hivemind than just the only practical way to do true all-in democracy? A permanent brain implant shows you today's policy decisions to vote on, and you get so used to voting how you like (and sneaky software probably 'helps') that you vote reflexively, eventually making voting so fast it happens autonomously, and they can just flick out tiny questions by the thousands because people will vote within a minute.

No, that's the Demarchists (democratic anarchists). The Conjoiners have a neural network that links them all together into a kind of soft hivemind, where each member retains their individuality but is also part of a bigger gestalt mind.

Yes, it was. Now I remember. Knew the groups but forgot the names even though the names literally describe what they are.

It was a more interesting hivemind since it was explicitly individualist. Every human is still themselves, just with vastly expanded thought and a limitless pool of readily available knowledge. If anything, the expanded sphere of thought was more liable to make them not human than the hive was.

>All transhumanist ideas are completely rational, and anyone who criticizes them is obviously a simple-minded throwback who smells funny

From your point of view, you don't join the hive, the hive joins you. You don't lose any of your memories or priorities, you simply gain a lot of others. Your identity doesn't disappear, it expands.

Basically, it would be like ego death - becoming one with the universe - on a lesser scale, and that doesn't really seem to result in devaluing human life.

>Taken even further, in reality a true hive mind would have no individuals at all.
We actually have plenty of examples of hiveminds: all multicellular life. Every single one of your cells is a living creature in its own right, and has an individual identity - a cell-sized identity, but one nonetheless. There are, of course, other levels of organization in your body, various organs and subsystems, all seeing to their tasks as independently as is practical.

What I'm getting at is that a true hive mind, such as yourself, requires structure, and as its scale grows, at some point it would be most efficient to just have these structural parts be fully sapient themselves. An interstellar hivemind empire, for example, would require subsystems for production, logistics, science, military, diplomacy, and so on. As each of these tasks has different requirements, the subsystems would also need to have different patterns of behaviour - in other words, different personalities. Such a hive would have its share of internal conflicts, just like our minds do; the defining feature would not be lack of individuality but absolute loyalty to the hive as a whole.

>It wouldn't be creating TV shows or the like for itself since what would be the point? It would already know everything about any given episode before it aired.
Broadcasting and storing every memory at every unit would quickly become unfeasible as the hive grows. Units would have local working memory, and the common memory would be handled by specialized units. This would solve such problems, and also allow things like military exercises.
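As a rough sketch of that split (all names and numbers here are invented for illustration, not taken from any of the works mentioned): ordinary units keep a small bounded working memory and only push events they flag as important to a dedicated memory unit.

```python
# Sketch of the "local working memory + specialized memory units" idea above.
from collections import deque

class ArchiveUnit:
    """Specialized unit holding the hive's shared long-term memory."""
    def __init__(self):
        self.records = []

    def store(self, event):
        self.records.append(event)

class WorkerUnit:
    """Ordinary unit with a bounded local working memory."""
    def __init__(self, archive, working_memory_size=5):
        self.archive = archive
        self.working_memory = deque(maxlen=working_memory_size)

    def experience(self, event, important=False):
        self.working_memory.append(event)  # everything stays local...
        if important:
            self.archive.store(event)      # ...only notable events get broadcast

archive = ArchiveUnit()
worker = WorkerUnit(archive)
worker.experience("routine patrol")
worker.experience("enemy scout sighted", important=True)
print(list(worker.working_memory), archive.records)
```

That way the hive only pays the broadcast cost for memories it actually wants the whole collective (or the archive units) to hold.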

That would depend on how exactly a hivemind composed of humans would work.
Most likely though, a hivemind wouldn't be a "human" mind even though composed of humans, so it has no reason to even have "fundamentally human qualities".

My favorite example of a benevolent hive mind: orionsarm.com/eg-article/486028564241d

Couldn't it also be like having empathy for, say, your hand? Sure, a hive mind could "grow" another "hand" and that's great, but I imagine losing something that is a vital part of you would still fucking hurt.

Iunno but hive minds are really fascinating to think about as long as they don't end up like the Borg Queen.

who are these glowy eyed bitches and why do they have their foreheads showing?

>A true hive mind is one mind, period.
Then that's not a hive mind, that's a gestalt consciousness. A hive mind is a literal hive of minds.

That all comes down to scale, really: a hive mind of a trillion minds wouldn't care if a couple of million died, whereas a mind formed from a mere hundred million or so would definitely feel hurt.

Define "hivemind".

If split in half, do both halves of the hivemind continue to function separately, albeit at reduced capacity?
Or does one of the halves either die or sever the connection between its members?

A hive mind is not more "rational" than a single mind. Its pure reasoning skills don't improve. A hive mind can still fall prey to logical fallacies, cognitive biases, etc. A hive mind is not immune from loss aversion or the gambler's fallacy.

A hive mind does gain a huge amount of information resources. How useful this is depends on the efficiency and trust of communication across the hive.
Bees and ants are very effective at scouting a large area. However, any new information must first be carefully verified before the hive as a whole reaches consensus. If one ant reports a new food source, it will bring two or three ants with it on the next trip. This slowly builds until all the workers are aware of the food location. If the entire hive immediately trusted a single ant, the hive mind would die out from being too gullible.
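A minimal sketch of that recruitment pattern (illustrative numbers only, not a claim about real ant biology): each informed forager brings a couple of nestmates along on its next trip, so confirmation spreads geometrically instead of the whole colony committing on one unverified report.

```python
# Toy model of hive-style information verification (a sketch under the
# assumption above: each informed forager recruits 2-3 nestmates per trip,
# so trust builds over repeated confirmations, not after a single report).
import random

def trips_to_consensus(colony_size=10_000, recruits_per_trip=(2, 3)):
    informed = 1  # one scout has found the food source
    trips = 0
    while informed < colony_size:
        newly_recruited = sum(
            random.randint(*recruits_per_trip) for _ in range(informed)
        )
        informed = min(colony_size, informed + newly_recruited)
        trips += 1
    return trips

print(trips_to_consensus())  # typically ~7-8 trips for 10,000 workers
```

The point being that the verification step only costs a handful of round trips, which is cheap insurance against one confused scout steering the whole hive.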

Individual members of a hive still have individuality. They still have a survival instinct. Even though they serve a common higher purpose, they still disproportionately value their own ideas and experiences.
Bee 219 values the hive more than her own life, and will sacrifice herself for the good of the hive. But Bee 219 still thinks that Bee 125 and Bee 97 are idiots, and Bee 219 probably wouldn't sacrifice herself just to save 1 or 2 other bees.

A hive mind may appear hyper-rational simply because it often makes smarter decisions than an individual. But in truth, the "wisdom of crowds" is often more jumpy and irrational than any individual.
A highly optimized hive is as smart as the smartest individual within it. A poorly optimized hive is as stupid as the stupidest individual within it.

Patwician taste