Consciousness

by IronGland · 4 Replies

  • IronGland

    Last night, as usual, I was lying in my pod meditating when I began to consider consciousness. I have long held the view that artificial intelligence is just a matter of time, and that when it arrives it will match human intelligence for only a few brief days as the sillier bugs are eliminated from the code. We are in the process of creating our replacements; the real question is whether we will find a way to come along for the ride.

    In thinking about artificial intelligence I wandered down the path of considering determinism and its impact. The Church-Turing thesis is nice in that it suggests anything computable can be computed by a Turing machine. One issue I had for a while is that you could always predict (to any finite time in the future) what a Turing machine would do. This forced me to consider myself a compatibilist.
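    (As a concrete aside on that predictability point, here is a minimal, purely illustrative Python sketch - the machine, its states and its transition table are all made up for the example. "Predicting" what a deterministic Turing machine does N steps from now amounts to nothing more than running the simulation for N steps.)

        # Toy deterministic Turing machine: flips every bit on the tape, then halts.
        # Transition table: (state, symbol) -> (symbol to write, head move, next state)
        TABLE = {
            ("flip", "0"): ("1", +1, "flip"),
            ("flip", "1"): ("0", +1, "flip"),
            ("flip", "_"): ("_", 0, "HALT"),   # blank cell: stop
        }

        def run(tape, steps):
            cells = dict(enumerate(tape))       # sparse tape; unwritten cells are blank ("_")
            state, head = "flip", 0
            for _ in range(steps):
                if state == "HALT":
                    break
                symbol = cells.get(head, "_")
                write, move, state = TABLE[(state, symbol)]
                cells[head] = write
                head += move
            return "".join(cells[i] for i in sorted(cells))

        print(run("10110", steps=100))   # "01001_" - fully determined by the table and the input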

    Lately, I have been thinking that in some ways consciousness is an illusion - something we pat ourselves on the back about as making us somehow special, or above all other forms of life. In reading a little more Bertrand Russell recently, I encountered a section where animal behaviourists were routinely reporting behaviour in animals that reflected their own biases. This got me a little way toward considering that a) the difference between our consciousness and that of animals is one of degree, not kind, and b) we can be very self-deceptive about what we see as distinctly human traits.

    I've also been fleshing out my views on epistemology and the mind. Increasingly I see the mind simply as inputs, outputs, memory, and changeable processing structures, with inherited structures underlying all of these components.

    Epistemically we have emotions and intuition, which are part of the inherited structures with an evolutionary benefit, and which motivate us to act and observe, and to form beliefs as a result. We learn some basics as children and, lacking the rational capacity of an adult, treat these as foundational. As we develop, rational capability increases, as does our set of beliefs, and some of us are motivated toward coherence in our belief sets - it has proven useful.

    There have been some suggestions that we resolve internal conflicts by managing cognitive dissonance. Nothing particularly earth-shattering about this, but think about what a Turing machine is and what it can do, and then think about the effect of complexity and chaotic systems - say the following one:

    • next value = 3.9 * last value * (1 - last value)
    And consider that this behaves unpredictably (well, sort of) but within a finite range of possibilities. Well, I ask myself the question: how is this different in kind from a mind and its consciousness? I come to the conclusion that it is no different in kind, only in degree.
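    (To make that concrete, here is a minimal, purely illustrative Python sketch of the r = 3.9 map above, iterated from two starting values that differ by one part in a billion. Everything stays inside the finite range 0 to 1, yet the two trajectories become completely different within a few dozen steps - deterministic, bounded, and still unpredictable in practice.)

        # Logistic map with r = 3.9: deterministic and bounded, yet chaotic.
        def logistic_step(x, r=3.9):
            return r * x * (1 - x)

        x_a, x_b = 0.5, 0.5 + 1e-9          # two almost identical starting points
        for step in range(1, 51):
            x_a, x_b = logistic_step(x_a), logistic_step(x_b)
            if step % 10 == 0:
                print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f}  (difference {abs(x_a - x_b):.2e})")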

    Consciousness is then an arbitrary point we choose to define somewhere in the range of complexity of processing systems, special in degree not in kind.
  • Terry

    Well....

    Humans have needs and necessity drives their yearning into accomplishments and, beyond that, into excess.

    Machines (A.I.), endowed with processing of any magnitude, have no necessity, no yearning, and seek no ends except what has been programmed into them.

    A machine would never build a machine. A human might want it to. But, for what purpose would any machine even function except to follow the path the human's necessity blazed?

    We build tools because we are small and weak.

    We cannot fit the Universe inside our head. So, we break it into tiny pieces. Our tools enable this to happen. But, it is all so that we, the human, can digest our environment in a way meaningful to our purpose. And that purpose lasts such a short time.

    Remember, in all such discussions as this, the human is really talking about himself. He is rather like the man who wants to perform his own brain surgery. He cannot open his own skull and poke around due to pain. If he sedates himself he cannot perform the operation.

    If he touches the wrong spot he might disconnect from any purpose at all.

    No, talking about our "consciousness" is not what we do best. Mainly because it is "us" in the way a flashlight at night helps us read a magazine article. Which is to say, it enables.

    But, we can be conscious and not focus. We can be awake and not take anything in at all. We must focus, we must endeavor to comprehend or we are just a blinking piece of meat.

    Consciousness by itself is better than nothing. But, not much better without WILL and purpose.

    T.

  • tetrapod.sapien

    I have read a bit of Raymond Kurzweil, and i'm kind of morbidly fascinated by the possibility of a singularity. like you say, it would be up to us to actually find a way to survive. i mean, if we were able to create nano-factories that were online, and then a singularity started to occur, there's a good chance we would be totally screwed. the sad part is that greed would probably blind us to a coming singularity until it was too late. there are a lot of very intelligent people working towards a singularity right now. it may never happen. but it also just may. i mean: evolution.

    sorry irongland, i don't have much to say about consciousness. it's one of those areas that, along with mortality, i am still trying to grasp.

    irongland: Consciousness is then an arbitrary point we choose to define somewhere in the range of complexity of processing systems, special in degree not in kind.

    i am leaning toward this view as well. it only seems to us like our consciousness is greater than the sum of its parts. but really, like any complex system, it's made of many smaller parts with simpler functions.

    terry: A machine would never build a machine. A human might want it to. But, for what purpose would any machine even function except to follow the path the human's necessity blazed?
    i can see how it is a tall order, no doubt. but we, after all, are just bio-machines, with bio-computers, designed by a blind watchmaker (to borrow a popular term). evolutionary computation holds some real promise, IMO, towards developing an actual artificial intelligence. by the time it is obvious that it's conscious, it might be too late. then again, this is all just speculation of course.
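    (for what it's worth, here is a toy, purely illustrative Python sketch of the evolutionary-computation idea - not anyone's actual research code, and every name and parameter is made up. a population of random bit strings is "bred" by selection and mutation toward an arbitrary target, the classic one-max toy problem; nobody writes the solution in by hand, it just accumulates.)

        import random

        # Toy evolutionary computation ("one-max"): evolve bit strings toward all ones.
        # Illustrative parameters only; nothing here is a real AI system.
        GENOME_LEN, POP_SIZE, MUTATION_RATE = 32, 50, 0.02

        def fitness(genome):
            return sum(genome)                          # number of 1-bits

        def mutate(genome):
            return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

        population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
        for generation in range(200):
            population.sort(key=fitness, reverse=True)  # rank by fitness
            if fitness(population[0]) == GENOME_LEN:
                break                                   # perfect genome found
            parents = population[: POP_SIZE // 2]       # selection: keep the fittest half
            population = parents + [mutate(random.choice(parents)) for _ in parents]
        print(f"best after {generation} generations: {fitness(population[0])}/{GENOME_LEN}")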
  • JamesThomas
    terry: Consciousness by itself is better than nothing. But, not much better without WILL and purpose.

    How is it then, that the most deeply beautiful, memorable, vibrantly alive and breathtaking moments experienced are those in which we are void of all "WILL and purpose" and simply silently present with a sunset, or a vista, or an art piece, or a child, or a garden, or our lover, or whatever?

    Could it be that Consciousness is infinitely vaster than what the mind/ego will ever, ever admit?

    j

  • Satanus

    I wouldn't worry about machines superior to us making us extinct. I mean, look at all the inferior animals, bugs and plants that we like to have around. Some are going extinct, but that is because we inadvertently or greedily change or destroy their habitats. Sometimes we just wipe em out for sport. But anyway, we want them around. So, maybe the machines would want us around, or wouldn't care one way or the other. Also, we would give intelligent, conscious machines a real good run for their money if it came to competition. More likely, we would cooperate. Iain M. Banks, in his Culture SF book series (Consider Phlebas, The Player of Games, The State of the Art, Use of Weapons, etc.), would give you some ideas of the possibilities.

    S
