this post was submitted on 29 Jul 2023

Technology

[–] maeries@feddit.de 22 points 1 year ago (2 children)
[–] 100years@beehaw.org 14 points 1 year ago (1 children)

Wow, solid wiki article! It's very hard to say anything on the subject that hasn't been said.

I didn't see the simple phrasing:

"What if the human brain is a Chinese Room?"

but that seems to fall under eliminative materialism replies.

Part of the Chinese Room program (both in our heads and in an AI) could be dedicated to creating the experience of consciousness.

Searle has no substantial logical reply to this criticism. He openly takes it on faith that humans have consciousness, which is funny because an AI could say the same thing.

[–] FlowVoid@midwest.social 5 points 1 year ago* (last edited 1 year ago) (1 children)

The whole point of the Chinese room is that it doesn't need anything "dedicated to creating the experience of consciousness". It can pass the Turing test perfectly well without such a component. Therefore passing the Turing test - or any similar test based solely on algorithmic output - is not the same as possessing consciousness.

[–] lloram239@feddit.de 3 points 1 year ago (1 children)

The problem with the Chinese room thought experiment is that it does not show that, at all, not even a little bit. The thought experiment is nothing more than a stupid magic trick that depends on humans assuming other humans are the only creatures in the universe that can understand. Thus when the human in the room is revealed to not understand anything, there must be no understanding anywhere near the room.

But that's a stupid argument. It does not answer the question of whether the room understands or not. Quite the opposite: since the room by definition passes every test we can throw at it, the only logical conclusion is that it understands. Any other claim is not supported by the argument.

For the argument to be meaningful, it would have to define "understand", "consciousness" and all the other aspects of human intelligence clearly and show how the room fails them. But the thought experiment does not do that. It just hopes that you buy into the premise because you already believe it.
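The purely mechanical symbol-shuffling at issue here can be sketched as a lookup table. This is a toy illustration with an invented two-entry rulebook; Searle's actual thought experiment assumes an astronomically larger rulebook, but the principle is the same:

```python
# A toy "Chinese room": the operator matches the shapes of the input
# symbols against a rulebook and copies out the prescribed reply,
# without attaching any meaning to the symbols. (Rules invented for
# illustration only.)
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",          # "How are you?" -> "Fine, thanks."
    "今天天气怎么样？": "今天天气很好。",  # "How's the weather?" -> "It's nice."
}

def room(symbols: str) -> str:
    # Pure syntax: compare shapes, emit shapes. No semantics anywhere.
    return RULEBOOK.get(symbols, "对不起，我不明白。")  # "Sorry, I don't understand."

print(room("你好吗？"))  # -> 我很好，谢谢。
```

Whether a (vastly scaled-up) system of this shape "understands" is exactly what the two sides above disagree about.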

[–] FlowVoid@midwest.social 2 points 1 year ago* (last edited 1 year ago) (1 children)

"The room understands" is a common counterargument, and it was addressed by Searle by proposing that a person memorize the contents of the book.

And while the room passes the Turing test, that does not mean that "it passes all the tests we can throw at it". Here is one test that it would fail: it contains various components that respond to the word "red", but it does not contain any components that exclusively respond to any use of the word "red". This level of abstraction is part of what we mean by understanding. Internal representation matters.

[–] lloram239@feddit.de 3 points 1 year ago (1 children)

it was addressed by Searle by proposing that a person memorize the contents of the book.

It wasn't addressed; he just added a layer of nonsense on top of a nonworking thought experiment. A human remembering and executing rules is no different from reading those rules in a book. Remembering the rules doesn't mean a human understands them. The human intuitive understanding works at a completely different level than the manual execution of mechanical rules.

it contains various components that respond to the word “red”, but it does not contain any components that exclusively respond to any use of the word “red”.

Not getting it.

[–] FlowVoid@midwest.social 1 points 1 year ago (1 children)

The human intuitive understanding works at a completely different level than the manual execution of mechanical rules.

This is exactly Searle's point. Whatever the room is doing, it is not the same as what humans do.

If you accept that, then the rest is semantics. You can call what the room does "intelligent" or "understanding" if you want, but it is fundamentally different from "human intelligence" or "human understanding".

[–] lloram239@feddit.de 1 points 1 year ago* (last edited 1 year ago) (1 children)

This is exactly Searle’s point. Whatever the room is doing, it is not the same as what humans do.

He fails to show that. All he has shown is that the human+room system is something different than just the human by itself. Well, doh, nobody ever assumed otherwise. Running a NES emulator on my modern x86-64 CPU is something different from running an original NES too. That doesn't mean that the emulator is more or less capable than the real NES, or that the underlying rules driving the emulator are different from the real thing. You have to actually test the systems and find ways in which they differ. Searle's experiment utterly fails here.

[–] FlowVoid@midwest.social 1 points 1 year ago

All he has shown is that the human+room system is something different than just the human by itself.

It's more than that. He says that all Turing machines are fundamentally the same as the Chinese room, and therefore no Turing machine will ever be capable of "human understanding".

Alternately, if anyone ever builds a machine that can achieve "human understanding", it will not be a Turing machine.
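For concreteness: a Turing machine is itself nothing but a rule table plus a tape, so the claim amounts to saying that no system of this shape, however elaborate its table, achieves human understanding. A minimal sketch (a unary increment machine, chosen arbitrarily):

```python
# A minimal Turing machine: (state, symbol) -> (write, move, next_state).
# This table appends one mark to a unary number. Like the room's
# rulebook, it is nothing but mechanical lookup.
TABLE = {
    ("scan", "1"): ("1", +1, "scan"),  # skip over existing marks
    ("scan", "_"): ("1", +1, "halt"),  # hit the blank: write one more mark
}

def run(tape: list) -> list:
    state, head = "scan", 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"
        if head >= len(tape):
            tape.append("_")               # extend the tape on demand
        write, move, state = TABLE[(state, symbol)]
        tape[head] = write
        head += move
    return tape

print(run(list("111_")))  # ['1', '1', '1', '1']
```

Searle's position is that scaling this lookup-and-rewrite scheme up, no matter how far, never crosses over into understanding; the objection above is that he never demonstrates that, only asserts it.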

[–] reflex@kbin.social 4 points 1 year ago* (last edited 1 year ago) (1 children)

en.wikipedia.org/wiki/Chinese_room

Man, I love coming across terms like this.

Chinese Room, Chinese Walls, Dutch Treat, Dutch Uncle, Dutch Oven.

[–] ivanafterall@kbin.social 1 points 1 year ago (2 children)

Wow! Me, too! What is a Dutch Oven!?

[–] shanghaibebop@beehaw.org 5 points 1 year ago (1 children)
[–] reflex@kbin.social 1 points 1 year ago* (last edited 1 year ago)

Or a fart in a blanket :)

*Satisfied nod.*

[–] FlowVoid@midwest.social 1 points 1 year ago

A covered pot.