this post was submitted on 25 Sep 2023
172 points (85.0% liked)

Interesting article; I didn't know where it fit best, so I wanted to share it here.

[–] bloodfoot@programming.dev 52 points 1 year ago (3 children)

Interesting, but I struggle to see how this hypothesis could ever be proven or disproven. If it can't actually be tested, then I don't see how it offers any more scientific value than any other religious or superstitious belief.

[–] FaceDeer@kbin.social 25 points 1 year ago* (last edited 1 year ago) (3 children)

I've long been fond of panpsychism, but I think it's less a hypothesis to be "proven" and more a different way of framing the questions behind what consciousness is and how it can be defined. Under panpsychism, consciousness isn't a binary property that some things have and other things don't; it's a continuum from zero to one (and if you count humans as "1" on the consciousness scale, it also makes sense to consider values above that - there's no reason to assume that humans are the "most conscious possible" state of being).

So when you're reading about panpsychism and it says something like "individual electrons are conscious", bear in mind that they're proposing considering electrons to be, like, 10^-10 "consciousness units" worth of conscious. It's not like they're actually aware of themselves in some meaningful way like humans are. That's a common "giggle factor" problem for panpsychism. And it's also not saying that any arbitrary larger-scale structure is "more conscious" than humans; the way that the components of a large-scale structure interact is super important. A rock is not equivalently as "conscious" as a human brain even if they have the same number of particles interacting within them.

[–] bloodfoot@programming.dev 11 points 1 year ago (1 children)

I think the real issue is that consciousness is not particularly well defined. Something can be more or less conscious than something else, but what precisely does that mean? Has there ever been a means of measuring or detecting consciousness in anything?

[–] 0ops@lemm.ee 7 points 1 year ago (1 children)

That's my biggest frustration with this debate. At this point I'm convinced that consciousness is only a construct: not a tangible entity, process, or concept, just a useful way to describe behavior. If someone describes the universe as conscious, that's neat and all, but it doesn't really mean anything yet. And another person could say it isn't, and neither would be right or wrong, because what the hell is consciousness? Like you said, how are we supposed to measure this when we don't know what it is? Many people think we haven't discovered what consciousness is; I believe we haven't decided what it is.

[–] Poteryashka@lemmy.ml 1 points 1 year ago (1 children)

Depends on who you ask, I think. Emergentism makes more sense to me: if you take consciousness as humans experience it, make it derivative of material structure (neurological activity), and assume the appearance of some kind of uniformity as a synthesis of different parts of that neurological system, then the only way consciousness can exist in that framing is in organisms that possess a nervous system.

This does inevitably lead to the problem of where to draw the line on the complexity necessary to qualify as consciousness, and I'm not gonna pretend like I have the answer to that, but at least it becomes more of a scientific question rather than a purely philosophical one, I think.

[–] 0ops@lemm.ee 1 points 1 year ago* (last edited 1 year ago) (1 children)

You could define it that way. I think it could be more abstract than that, personally, because

a. Is the nervous system in animals the only neural network in nature? I've heard discussion on whether some types of fungus are conscious, based on how they send chemical signals to other parts of the fungus. It's slow, but does it count? And then there's the collective consciousness of ant colonies and beehives. That's a level above, where each bug's nervous system is itself a node in a larger neural network.

b. I think that consciousness is more than just the nervous system. In another comment under this post I argued that a neural network (in an abstract sense) can only "think" in terms of the sensors it has access to. What does the lab-grown brain think about? It's never seen things, it's never heard sounds or words; can it feel touch? (I'm not an anatomy guy.) My hunch is it's just static, essentially an "untrained" neural network. Does that count as conscious? Maybe those senses are considered part of the nervous system; again, I'm not an anatomy guy.

But then how do the "chemical computations" like hormones and gut bacteria come into play? Are they just indirectly sensed by the nervous system?

[–] Poteryashka@lemmy.ml 1 points 1 year ago (1 children)

I'm really not exactly sure what qualifies, but the existence of an emergent system still has to be there. Does fungus communication give rise to a system that can build some kind of memory and refer to it to develop more complex behavior? If not, then it's lacking the level of complexity to be considered consciousness. (But that's just where I personally draw the line.)

Eusociality has its own context. It's possible for a hive to show complex organized behavior, but so would an infinite paperclip machine if it were to consist of a swarm of collector drones. A myriad of units with a set of pre-determined instructions can show complex organization, which still wouldn't qualify as consciousness.

Now, the brain scenario would definitely count since it consists of the necessary "hardware" to start generating its own abstract contextual model of its experiences.

[–] 0ops@lemm.ee 1 points 1 year ago

A myriad of units with a set of pre-determined instructions

Like neurons? My argument was that, in an abstract sense, a single ant could be considered a neuron. It senses the environment and other ants for inputs, and it interacts with the environment and other ants for output. A network of ants is capable of complex behavior. By this logic, of course, just about any entity could be considered a neuron, and any collection of entities a neural network, which I think is what the original article is getting at. Now, is the ant colony conscious? I don't know. Am I conscious? I think so; it seems like it. Are you conscious? You seem a lot like me, and I think I probably am, so I think you probably are too. Basically what I'm saying is I haven't heard of a definition of consciousness that doesn't wind up encapsulating everything or nothing, or that isn't human-centric.
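
As a purely illustrative sketch of that "anything with inputs and outputs can be a node" framing: the class names and threshold rule below are invented for this comment (not taken from the article), but they show how an ant and a neuron can expose the same tiny interface.

```python
# Hypothetical sketch: any entity that senses inputs and produces an output
# can be treated as a "node", and any collection of interacting nodes as a network.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    """An abstract unit: it senses connected nodes and produces an output."""
    threshold: float = 0.5
    inputs: List["Node"] = field(default_factory=list)
    output: float = 0.0

    def sense(self) -> float:
        # Sum whatever the connected nodes are currently signalling.
        return sum(n.output for n in self.inputs)

    def act(self) -> None:
        # Fire (output 1.0) only if the combined input crosses the threshold.
        self.output = 1.0 if self.sense() >= self.threshold else 0.0


class Ant(Node):
    """Under this framing, an ant is just a Node: it senses pheromones and
    other ants, and its behaviour is its output signal."""


# Two "ants" whose behaviour feeds into a third:
a, b = Ant(), Ant()
a.output = 1.0                          # pretend a just sensed food
c = Ant(threshold=0.5, inputs=[a, b])
c.act()                                 # c fires because the combined input crosses its threshold
assert c.output == 1.0
```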

Now, the brain scenario would definitely count since it consists of the necessary "hardware" to start generating its own abstract contextual model of its experiences.

So you're saying that you don't need experience to be conscious, just the potential to experience? I'm not sure I agree with that. Yeah, there are diminishing returns; I don't think an old person is significantly more self-aware than a kid in the grand scheme of things. But pretty much every thought I've ever had (that I realized I had, anyway) was in terms of a sense I had, or at least derived from the senses. Even a newborn has been feeling and hearing since it was an embryo. Now there is instinct to consider: that evolved, and while it can influence and direct consciousness, I don't think acting on instinct is a conscious act itself. What I'm saying is, can a brain in a jar with no contact with the world, that's never had contact with the world at any point, be aware of itself? What is self without environment?

[–] justastranger@sh.itjust.works 5 points 1 year ago (1 children)

I prefer to consider it in terms of "dimensions of awareness". Humans have evolved hundreds, possibly thousands, of interlinked dimensions of awareness for just about everything from colors to body language. Simple automated systems with sensors have their own dimensions of awareness, from vision to heat to pressure: whatever it is that they track and respond to. AI, however, is finally hitting the point where these dimensions of awareness are being stacked and linked together (GPT-5 can see, hear, read, and respond), and it's only a matter of time and agency (a.k.a. executive functioning) before we see true AI consciousness.
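
A loose way to picture "stacking and linking" dimensions of awareness in code: each modality gets its own encoder that turns raw sensor data into features, and fusing them is just combining those features into one shared representation. Everything below (the function name, the placeholder encoders) is a made-up toy, not any real model's API.

```python
# Toy sketch: each "dimension of awareness" is an encoder that turns raw
# sensor data into a feature vector; "linking" them is mapping every
# available modality into one shared list of features.
from typing import Callable, Dict, List

Encoder = Callable[[object], List[float]]

def fuse_awareness(raw_inputs: Dict[str, object],
                   encoders: Dict[str, Encoder]) -> List[float]:
    """Concatenate the encoded output of every modality the system can sense."""
    fused: List[float] = []
    for modality, encoder in encoders.items():
        if modality in raw_inputs:          # a sense the system actually has
            fused.extend(encoder(raw_inputs[modality]))
    return fused

# Placeholder encoders standing in for real vision / audio / text models.
encoders: Dict[str, Encoder] = {
    "vision": lambda image: [float(len(str(image)))],
    "audio":  lambda clip:  [float(len(str(clip)))],
    "text":   lambda words: [float(len(str(words)))],
}

fused = fuse_awareness({"text": "hello", "vision": "pixels"}, encoders)
```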

[–] 0ops@lemm.ee 2 points 1 year ago

I had a similar thought recently, actually: that consciousness is more than the brain. Is GPT-4 conscious? Eh, I don't believe anyone knows what that means, but is it comparable to human consciousness? I don't think so, but how could it be? It senses words, so it knows words, so it speaks words.

I hear it said all the time that LLMs don't really understand what they're talking about, but they seem to understand as well as they can given the dimensions they are aware of, to use your terminology. I mean, how can I describe anything myself without sensory details? It sounds like. It looks like. It feels like. It behaves like. We got all that knowledge by sensing, then inferring. There's no special sauce that creates understanding from nothing.

I don't have any links, but IMO the experiences of people who were born without a sense, and especially those who were later able to gain it, strongly support the idea that something can only be conceptualized in the terms in which it was sensed.

[–] lily33@lemm.ee 2 points 1 year ago

Well, hypothetically, if someone defined the "consciousness" of every particle mathematically, and then figured out the laws that would allow us to compute (or at least approximate) the "consciousness" of a composite system (such as a brain), then we'd have a genuine scientific theory.
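
Just to illustrate the shape such a theory could take (and nothing more: the constant, the function name, and the interaction term below are entirely invented), a hypothetical "composition law" might score a system by how richly its parts interact rather than by raw particle count, so a rock and a brain of the same size would come out differently, echoing the point made above.

```python
# Purely hypothetical composition law: the composite score depends on the
# parts' base values AND on how richly the parts interact, not just on count.
from itertools import combinations
from typing import Dict, FrozenSet, List

BASE_UNIT = 1e-10   # invented per-particle value, echoing the "10^-10 units" idea above

def composite_consciousness(particles: List[str],
                            interactions: Dict[FrozenSet[str], float]) -> float:
    """Sum of per-particle base values, scaled by total pairwise interaction strength."""
    base = BASE_UNIT * len(particles)
    linkage = sum(interactions.get(frozenset(pair), 0.0)
                  for pair in combinations(particles, 2))
    return base * (1.0 + linkage)

# A "rock": lots of particles, essentially no structured interaction.
rock = composite_consciousness([f"p{i}" for i in range(100)], {})

# A toy "brain": the same particle count, but densely interacting parts.
neurons = [f"n{i}" for i in range(100)]
dense = {frozenset(pair): 0.01 for pair in combinations(neurons, 2)}
brain = composite_consciousness(neurons, dense)

assert brain > rock   # same size, different score, because structure matters
```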

[–] Grayox@lemmy.ml 4 points 1 year ago

I could see it being used to help develop theories about the gaps in our understanding of the universe in theoretical quantum mechanics. That's the only field of thought that could lead to quantifiable experiments to test these hypotheses.

[–] stingpie@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (2 children)

Here's another way of framing it: qualia, by definition, is not measurable by any instrument, but qualia must exist in some capacity in order for us to experience it. So we must assume either that we cannot experience qualia, or that qualia exists in a way we do not fully understand yet. Since the former is generally rejected, the latter must be true.

You may argue that neurochemical signals are the physical manifestation of qualia, but making that assumption throws us into a trap. If qualia is neurochemical signals, which signals are they? By what definition can we precisely determine what is qualia and what is not? Are unconscious senses qualia? If we stimulated a random part of the brain, unrelated to the sensory cortex, would that create qualia? If the distribution of neurochemicals could be predicted, and the activations of neurons were deterministic as well, would calculating every stimulation in the brain be the same as consciousness?

In both arguments, consciousness is no clearer or blurrier, so which one is correct?

[–] bloodfoot@programming.dev 7 points 1 year ago (1 children)

So our subjective experience must "exist" because we experience it? This seems rather circular. My personal take: consciousness is an artifact of how our brains work. It's not a thing that exists in any physical sense; it is simply part of the model our brain builds to structure the stimulation it receives throughout the course of our lives.

[–] stingpie@lemmy.world -1 points 1 year ago (1 children)

All of science is based on the assumption that what is observed and experienced exists. You cannot gather data without at some point experiencing some representation of that data. In this sense, qualia is the most real thing possible, because experience is the essence of evidence.

[–] bloodfoot@programming.dev 3 points 1 year ago* (last edited 1 year ago)

So how do you measure qualia? What is it made of? How is it actually defined? How do you detect if qualia is present in something other than your own head?

I stand by my statement that qualia is simply an artifact of our cognitive architecture. You are welcome to disagree but the arguments you are presenting fail to convince me in the slightest.

[–] Slotos@feddit.nl 4 points 1 year ago (1 children)

“We decide that it exists so it exists” is a terrible argument.

Consequently, there’s no “trap” in attributing it to neurochemical signals. Emergence is a known phenomenon, and it’s present everywhere. Asking “which signal is qualia” is as nonsensical as asking “which atom is a star” or “which transistor is the video on my phone”. It’s a deflection and misdirection.

I get it, people want to feel magical. But there's a name for magic that works: science. Neurochemical processes are no less magical than some untestable source of experiences, with one big difference: they demonstrably exist.

[–] stingpie@lemmy.world -2 points 1 year ago (1 children)

I'm not sure I entirely understand your argument. "We decide it exists, therefore it exists" is the basis of all science and mathematics. We form axioms based on what we observe, then extrapolate from those axioms to form a coherent logical system. While it may be a leap of logic to assume others have consciousness, it's common decency to do so.

Onto the second argument: when I say "what signal is qualia", I'm talking about the minimum number of neurons we could kill to completely remove someone's experience of qualia. We could sever the brain stem, but that would kill an excess of cells. We could kill the sensory cortex, but that would kill more cells than necessary. We could sever the connection between the sensory cortex and the rest of the brain, etc. As you minimize the number of cells, you move up the hierarchy, and eventually reach the prefrontal cortex. But once you reach the prefrontal cortex, the neurons that deliver qualia and the neurons that register it can't really be separated.

Lastly, you said that assuming consciousness is some unique part of the universe is wrong because it cannot be demonstrably proven to exist. I can't really argue against this, since it seems to relate to the difference in our experience of consciousness. To me, consciousness feels palpable, and everything else feels as thin as tissue paper.

[–] bloodfoot@programming.dev 4 points 1 year ago

Science is built upon repeatable experiments that can be used to test hypotheses. It is not built on axioms and logical extrapolation; those are used to form new hypotheses, but they are insufficient by themselves. We don't decide something exists, we hypothesize that it exists and make predictions based on that hypothesis. If experimental results line up with our predictions, then we call that a theory. If new data contradicts the theory or hypothesis, then we revise and try again.