
Nature of the soul and memory

#11 GuideToTheDark Posted 9/19/2013 1:19:47 PM
If there's no electricity flowing through it, the code is just blips on a magnet, not a program. Interesting.
---
SHUTUPPU ANDE EAT! TOO BAD NO BON APPETIT!
#12 Eastsideslinger (Topic Creator) Posted 9/19/2013 1:27:10 PM
Moorish_Idol posted...

I think the mind requires a link with a brain to be of any discernible use. A device able to interpret the brain's content would just be simulating consciousness -- I don't think it would have much ability to function beyond the data it has, the way our mind now can. In other words, it would be limited to the content that was transferred -- it wouldn't be able to create new content.

I wouldn't call a simulated brain a true "self", much to the disappointment of the Bicentennial Man.


So you would say that the physical component, the brain, if left without the mind, would simply contain the remnants of thought or memory but be unable to process them in any meaningful way without the soul to drive interpretation?
---
"That's Mushy Snugglebites' badonkadonk. She's my main squeeze. Lady's got a gut fulla' dynamite and a booty like POOOW!" - Tiny Tina
#13 kozlo100 Posted 9/19/2013 1:30:35 PM
GuideToTheDark posted...
If there's no electricity flowing through it, the code is just blips on a magnet, not a program. Interesting.


Yea, that's more or less my view.
---
Time flies like the wind,
and fruit flies like a banana.
#14 Thuggernautz Posted 9/19/2013 1:34:05 PM
Moorish_Idol posted...

I think the mind requires a link with a brain to be of any discernible use. A device able to interpret the brain's content would just be simulating consciousness -- I don't think it would have much ability to function beyond the data it has, the way our mind now can. In other words, it would be limited to the content that was transferred -- it wouldn't be able to create new content.

I wouldn't call a simulated brain a true "self", much to the disappointment of the Bicentennial Man.


I'll ignore the first part because it's just a lot of empty speculation, but this part interests me. Once the simulation becomes indistinguishable from the real thing, how can you distinguish between the two, and what importance does the dichotomous distinction have? Moving on, we already have creative computing heavily involved in practically all aspects of life.

This program creates 'classical' music, which in many cases listeners described as having more 'soul' or emotion than compositions by humans:

http://arstechnica.com/science/2009/09/virtual-composer-makes-beautiful-musicand-stirs-controversy/
#15 kozlo100 Posted 9/19/2013 1:47:25 PM
I heard that the music composed by that thing generally felt flat and technical to trained ears, even in blind listens. Could be different audiences though; I'm sure I couldn't tell the difference.
---
Time flies like the wind,
and fruit flies like a banana.
#16 Thuggernautz Posted 9/19/2013 1:54:48 PM
kozlo100 posted...
I heard that the music composed by that thing generally felt flat and technical to trained ears, even in blind listens. Could be different audiences though; I'm sure I couldn't tell the difference.


Yep, there are certainly those reports too. It seems that, as with most other music, tastes vary. One could almost mark that down as a selling point for the creativity of the AI. ;)
#17 Moorish_Idol Posted 9/19/2013 3:27:55 PM
kozlo100 posted...
On this second part, what is it that you think is special about the brain that makes it the only thing a mind can link to? Or to put it a little more technically, what is it you think a synapse can do that a transistor can't?

The main reason I don't think the mind links to anything else, like a computer, is because we haven't seen inanimate objects have a mind like ours. While I think it may be possible to eventually achieve sentience in a computer/A.I. (awareness of purpose), I find it unlikely to achieve sapience (awareness of meaning).

That said, hypothetically a mind could perhaps use a computer as a conduit just as it uses the brain as a conduit. I think there would be a lot of barriers in it that are absent in our brain, though, since I'm not sure we could physically create something that isn't less complex than ourselves. So assuming it's possible to link to a computer, it still wouldn't be equal to a brain.

Eastsideslinger posted...
So you would say that the physical component, the brain, if left without the mind, would simply contain the remnants of thought or memory but be unable to process them in any meaningful way without the soul to drive interpretation?

I think a brain without a mind would stop functioning altogether. This is different from conditions like vegetative states, where some basic functions still operate (or, in some cases, thought processing still occurs, but can't manifest externally). I think a brain needs a mind in order to function, even at the basic level, since the mind provides consciousness.
#18 Moorish_Idol Posted 9/19/2013 3:27:59 PM
Thuggernautz posted...
I'll ignore the first part because it's just a lot of empty speculation, but this part interests me. Once the simulation becomes indistinguishable from the real thing, how can you distinguish between the two, and what importance does the dichotomous distinction have? Moving on, we already have creative computing heavily involved in practically all aspects of life.

I think the dichotomy would be apparent in seeing if the simulation can do something beyond the programmer's design.

For example, if a simulation was given no data regarding metaphysical existence, such as god, would it ever be able to postulate the existence of such things? Would that music contraption you linked to have been able to create music if it was never programmed to?

In order to be truly indistinguishable, a simulation would have to do things that were never intended in its design.
#19 kozlo100 Posted 9/19/2013 3:51:17 PM
Moorish_Idol posted...
The main reason I don't think the mind links to anything else, like a computer, is because we haven't seen inanimate objects have a mind like ours. While I think it may be possible to eventually achieve sentience in a computer/A.I. (awareness of purpose), I find it unlikely to achieve sapience (awareness of meaning).


My way around that is to note that we also don't see inanimate objects analogous to our brains, except other brains, and some of those do show signs of sentience and sapience, though it's hard to tell due to communication barriers. From there, if a transistor can do what a synapse can do, then since the brain is a finite size with finitely many connections, it's just a matter of engineering at that point.

Still, I do respect your approach.

On Thugg's point, and your response to it. How many humans would come up with metaphysical concepts or music if they were not taught about them? Obviously some, because we did as a species, but the vast majority of us do it because we are taught to, or it arises out of some facet of our nature, which is also a kind of programming. It makes me think the question has fewer implications than it seems when applied to single systems like the music machine.
---
Time flies like the wind,
and fruit flies like a banana.
#20 Moorish_Idol Posted 9/19/2013 5:01:06 PM
kozlo100 posted...
My way around that is to note that we also don't see inanimate objects analogous to our brains, except other brains, and some of those do show signs of sentience and sapience, though it's hard to tell due to communication barriers. From there, if a transistor can do what a synapse can do, then since the brain is a finite size with finitely many connections, it's just a matter of engineering at that point.

Fair point. My question would be: if we succeed in engineering something with equivalent transistors, will it be able to do the functions that I am assuming require the aid of the mind? If it can, then my idea of dualism is wrong. If it can't....

Still, I do respect your approach.

Thanks for saying this, seriously. I've discussed the mind-body problem with a lot of people over the years because it's a big interest of mine, and it's nice hearing you acknowledge my approach even if you don't agree with it. It gets a bit discouraging hearing people say "No evidence, blind assertion," all the time as if I'm trying to convert them.

On Thugg's point, and your response to it. How many humans would come up with metaphysical concepts or music if they were not taught about them? Obviously some, because we did as a species, but the vast majority of us do it because we are taught to, or it arises out of some facet of our nature, which is also a kind of programming. It makes me think the question has fewer implications than it seems when applied to single systems like the music machine.

This is a fair point too. I considered not responding to the music thing because I didn't want to seem like I'm proposing the mind is responsible for music creation. I was more addressing the idea that the machine created music, when in reality the machine was simply doing what it was programmed to do.

It's true that nature has programmed our brains. But there seems (to me) to be some intrinsic link between nature and super-nature that isn't present between synthetics and super-nature.