Worm’s brain mapped and replicated digitally to control obstacle-avoiding robot.

  • Warl0k3@lemmy.world
    3 months ago

    You’re coming at this from a slightly askew angle. Consciousness is holographic - that is, it’s complex behavior arising in the interaction of a more complex system. There’s nothing “more” to it than what we see. The transporters from Star Trek, which destroy then reproduce exactly, would change nothing about your experience. You’re just a complex arrangement of atoms, and it doesn’t matter where that arrangement occurs so long as it’s unique. There is no “you”, there’s just “stuff” stuck together in a way that lets it think that it can think about itself. A perfect reproduction would result in the same entity, perfectly reproduced.

    • octopus_ink@lemmy.ml
      3 months ago

      A perfect reproduction would result in the same entity, perfectly reproduced.

      It would, but I remain convinced that the continuity of my experience would end, same as if I died, and the entity who came out the other side would believe itself to be me, and believe itself to be unscathed, but actually exist only until the next time it got into a transporter, when the cycle would happen again.

      • originalucifer@moist.catsweat.comOP
        3 months ago

        continuity of my experience would end

        why? what property is altered that would ‘end continuity’? kinda just sounds like a personal delineation… a personal preference. like being annoyed at being ‘interrupted’.

        • octopus_ink@lemmy.ml
          3 months ago

          I don’t think I can defend my position very cogently or I’d argue against other interpretations more vigorously - and as I’ve said I’d love to be wrong. It’s certainly at or beyond the depth of my understanding of consciousness, but that doesn’t mean I accept that yours is necessarily more valid. (no snark intended with that comment)

          When I bring it up I get challenged to articulate why I feel that way and inevitably get presented with a question like yours that I can’t answer - but generally no one gives me a “here’s why you are wrong” argument, they just give me “you can’t differentiate between what you’ve posited and a nondestructive consciousness transfer and therefore you are wrong.” I maintain that my lack of ability to articulate that difference reflects poorly on me, but doesn’t actually prove I’m wrong.

          For example, I don’t think my inability to articulate a ‘property that is altered’ represents a weakness in my position, and I’m not sure a property needs to be altered for my understanding to be true.

          Using (very poorly and atypically) the ship of Theseus example, I think we’d agree on the following. Suppose I had two absolutely identical sets of shipbuilding materials, down to the atomic level - or further, down to the state of all observable properties of that matter and the particles that make it up (I have no idea how one would achieve such a thing). If I built a ship from one set of those materials, then vaporized that ship and built another that was 100% identical using the second set, those ships would be two identical but distinct entities. I don’t think I’ve seen an argument that convinces me that the same wouldn’t be true for pulling my consciousness (ephemeral and subjective as it may be) and body through a transporter or other such destructive process.
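
          To put the two-ships point in programming terms (a loose analogy, and only that): value equality and object identity can come apart, and that gap is the whole disagreement.

```python
# Two "ships" identical in every observable property,
# yet still two distinct objects with separate identities.
ship_a = {"hull": "oak", "planks": 300, "mast": "pine"}
ship_b = {"hull": "oak", "planks": 300, "mast": "pine"}

print(ship_a == ship_b)  # True: identical in every property we can compare
print(ship_a is ship_b)  # False: nonetheless two distinct entities
```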

          Your argument feels like you are telling me that if I use a replicator to make two different but identical cups of earl grey hot they are actually the same cup of tea, when plainly they are not. Considering (sticking with star trek) the stories of duplicates due to being stuck in the “pattern buffer” or similar handwavium, it seems clear that the ST transporter is capable of creating multiple entities. The only difference between a normal transporter experience and one of those freaky transporter accidents seems to be whether the two entities are both alive at the same time.

          COULD there be (since we’re in the realm of scifi anyway) some method of transferring consciousness that wouldn’t seem like death to me? Yes I’m sure there could. But I don’t think I’ve seen one in any popular scifi, at least not that I can think of right now.

          • originalucifer@moist.catsweat.comOP
            3 months ago

            youre not wrong in that cloning you twice would immediately create 2 distinct entities. and their consciousness/brains would immediately differentiate. so? now theres 2 of you.

            i dont see the problem with there being 2 versions of you instead of the 1 that was destroyed and recreated in a transporter. its the experience that makes the differentiation, and if there is only 1 of you at a time there is no differentiation. only one of you continues experience, there you are.

            • octopus_ink@lemmy.ml
              3 months ago

              and if there is only 1 of you at a time there is no differentiation. only one of you continues experience, there you are.

              In my interpretation it’s a different one of me, and that matters. Granted, I don’t expect either of us are on a path that is likely to convince the other, but fundamentally that’s my objection. (see my two different ships example)

              • Warl0k3@lemmy.world
                3 months ago

                The fundamental difference between your two positions seems to be whether an identical ship, once created, would be a fundamentally different ship. But that’s just something you’ve assumed. Why would that actually be the case? What, when you really get down to it, would be the difference that you could point to and say “ah, this one is a copy”? They would be, truly, definitionally, the same object. The differences between an original and a duplicate that existed together would only appear after they were created - if they appeared before they were created, then (again definitionally) they wouldn’t be identical copies.

                If you destroyed the original and then created the duplicate, there wouldn’t be any differences - it would be created as an identical version, and continue being that version, accumulating differences only to itself. Nothing about it would have diverged from that instant of creation. How could it? There’s nothing to diverge from. If you can assume that there could be an original that isn’t destroyed, and then a copy created of it, then why couldn’t you just swap those labels around? Have a duplicate, and create an original from it. If for an instant they’re the same, then… er… there’d be no difference. The labels are just a human affectation.

                Think of it like transferring a file. I’m sure you’ve moved a file onto a different drive or dragged something from your downloads folder to your desktop or some such similar action. What actually happens (at least when moving between drives; within one filesystem it’s usually just a metadata update, but the point stands) is that the file is frozen to modification, copied from one place to the other, then deleted from the first place. But in all the times you’ve done that, have you ever thought to yourself “huh, you know, this isn’t actually the same file as what I initially clicked on”? And that’s because fundamentally, mathematically, it is the same file. Changes to the file follow it around when it’s moved again; if you change the name it’s still referring to the same piece of data, etc. It’s the same, single file.
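
                A rough sketch of that copy-then-delete ‘move’ (a toy illustration; real filesystems differ in the details):

```python
import hashlib
import os
import shutil
import tempfile

def move_file(src, dst):
    """A 'move' modeled as copy-then-delete, as described above."""
    shutil.copy2(src, dst)  # reproduce the bytes (and metadata) at the destination
    os.remove(src)          # then destroy the original

tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "original.txt")
dst = os.path.join(tmp, "moved.txt")
with open(src, "wb") as f:
    f.write(b"the same file, mathematically")

digest_before = hashlib.sha256(open(src, "rb").read()).hexdigest()
move_file(src, dst)
digest_after = hashlib.sha256(open(dst, "rb").read()).hexdigest()

print(digest_before == digest_after)  # True: content is bit-for-bit identical
print(os.path.exists(src))            # False: the "original" no longer exists
```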

            • Zoot@reddthat.com
              2 months ago

              It’s a huge difference. I for one like waking up after I go to sleep; if I didn’t wake up, then the real “me” would cease to exist.

              Similar to a transporter, you can never be truly sure it’s the original “you” waking up on the other side.

          • Sanctus@lemmy.world
            3 months ago

            We see someone’s POV going through a transporter: you just see where you are, sparkles, and now you’re somewhere else. The unease probably comes from the uncertainty. The mere fact we can’t ascertain what really happens to your consciousness in a transporter is very suspect in a universe like Star Trek, where we find science babble for everything.

            Though, think of it for a moment. Your atoms are being torn apart and the structure is being rebuilt somewhere else. That totally just sounds like you die. I wouldn’t want to go in there either.

            • octopus_ink@lemmy.ml
              3 months ago

              Your atoms are being torn apart and the structure is being rebuilt somewhere else. That totally just sounds like you die. I wouldn’t want to go in there either.

              Exactly.

              Again though, if the technology were actually real, I would expect that there would be a layman-friendly explanation of why it wasn’t actually death that I’d be able to accept. I just haven’t seen one in all the times I’ve had this discussion.

        • LwL@lemmy.world
          3 months ago

          Think of an alternative scenario, not transportation but rather duplication. The original stays where it was, but a copy gets created elsewhere. To the copy, it will seem as if it got transported there. To the original, nothing will have happened.

          Now you kill the original.

          The only difference is the timing of ending the original.

          • Warl0k3@lemmy.world
            3 months ago

            Ah, but it wouldn’t be a copy of the original. In a hypothetical Star Trek transporter accident that results in a duplicate, there would be an instant of creation where the dupe and original would be truly identical - and then the question would be which one of those two is ‘you’? They’d be experiencing identical things, so how could you tell them apart? What would even be the point? They’re identical; there is by definition no difference between them. The differences would only come once the duplicate is exposed to different (for lack of a better term) ‘external stimuli’ than the original, like different angles of seeing the transporter room or the sensation of suddenly and rapidly growing a goatee. Your perception wouldn’t continue with the duplicate because your experience would be different from the duplicate’s (for example, you wouldn’t have mysteriously grown a goatee).

            If you destroyed the original and then made the duplicate, it would start at that moment of total equivalence, but there would be no deviation. There’d just be one version, identical to the original, moving forward through time. ‘You’ would just go on continuing to be you. Consciousness isn’t a ‘thing’ - it’s not magic, it’s just a weird state that arises in sufficiently complex systems. You and I and everyone else in this thread aren’t special; we’re just extremely densely packed networks that have the ability to refer to themselves as an abstract concept.

            It’s a similar thing to the classic “But how do I know that what I see as the color green is what you see as the color green” question. The answer is that the “color green” that we see isn’t real; ‘green’ is just a nerve impulse that hits a network. Each photoreceptor just sends a signal. If we were computers, the world would be represented as an array of values, and the question becomes the much clearer “How do I know what I see as G_101.34 is what you see as G_101.34” - which just isn’t quite as punchy.
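
            A rough sketch of that framing (the observers and the G_101.34 value are just made-up labels):

```python
# Two "observers" receive the same stimulus, encoded as nothing but a number.
# Neither ever has access to some intrinsic "greenness" -- only the value.
stimulus_green = 101.34  # arbitrary channel reading, standing in for "green"

def perceive(observer, value):
    # Each observer's "experience" here is just its processing of the number.
    return f"{observer} registers G_{value}"

print(perceive("you", stimulus_green))  # you registers G_101.34
print(perceive("me", stimulus_green))   # me registers G_101.34
```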

        • fishos@lemmy.world
          3 months ago

          “what property is altered”

          Ummm, the part where you are a continuous object that is suddenly disassembled.

          Don’t be intentionally obtuse. Yes, this is a ship of Theseus type problem, but there’s a very clear point when you stop being “you” - when you’re a stream of atoms. Although many versions of a teleporter don’t transmit the atoms, only the data of how they’re arranged. In that case, you are very distinctly a photocopy, as no original atoms remain.

          In the case of atom transfer, you stop being you during the time you are a bundle of atoms with no consciousness. Some people believe we’re like a forever stew: if you shut it down like that and reboot it, it’s not the “same” stew anymore, because what mattered wasn’t just the emergence of the consciousness, but the specific emergence itself. Essentially, You v1 died in its sleep and You v2 seamlessly took its place without knowing. Though that line of thought could be applied to sleeping and loss of consciousness during surgery too.

          All of this is to say it’s not a cut-and-dried answer, and people claiming there’s a definitive, clear-cut answer are incorrect. It’s a complex question that touches on the very nature of our existence and is still hotly debated. Even academics who believe we are purely chemical machines debate exactly how that works.

          • Warl0k3@lemmy.world
            3 months ago

            As an academic with a great deal of experience in this field, I can quite confidently say that it’s not a debated topic at all. At least, not among academics. We’re (somewhat predictably) called to debate it with representatives of the various religions and spiritual creeds on an almost continuous basis, though.

            And it really isn’t academically debated - topics surrounding it, like the nature of the conditions leading to the formation of the networks which form a ‘mind’, admittedly are debated, but the fundamental truth that a ‘mind’ is a holographic pattern arising from said network is quite a settled topic, and has been for thirty-some years now.

            • fishos@lemmy.world
              3 months ago

              Ok, so what is the exact process that creates consciousness? Cus that’s what I’m saying is debated but you apparently have that answer. So what EXACTLY, down to the atomic level, is consciousness? What processes and how do they emerge into consciousness?

              I’ll be waiting for your exact, undebated answer.

              • Warl0k3@lemmy.world
                3 months ago

                Can’t, but I suspect not for the reason you’re hoping. The consensus, at least among computational neuroscientists (the field that, among other things, studies how brains work mathematically), is that “consciousness” as a concrete thing isn’t really… real. It’s just a term humans created to loosely describe a phenomenon that arises from any sufficiently complex well-ordered network. If you want to know what it really looks like, you can run your own OpenWorm robot! The human ‘mind’ looks just like that, only around a dozen orders of magnitude more complex.
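
                To make ‘a phenomenon arising from a sufficiently complex well-ordered network’ a little more concrete, here’s a toy sketch - invented nodes and weights, nothing like the real worm connectome:

```python
# A toy "connectome": fixed weighted connections, activity propagated in steps.
# The node names and weights here are invented for illustration only; the point
# is that behavior falls out of nothing but connections and states.
weights = {
    "touch_sensor": {"interneuron": 1.0},
    "interneuron":  {"motor_left": 0.7, "motor_right": -0.7},
    "motor_left":   {},
    "motor_right":  {},
}

def step(activity):
    """Propagate each node's activity one hop along the weighted edges."""
    nxt = {node: 0.0 for node in weights}
    for src, outs in weights.items():
        for dst, w in outs.items():
            nxt[dst] += activity.get(src, 0.0) * w
    return nxt

state = {"touch_sensor": 1.0}  # the worm bumps into an obstacle
state = step(step(state))      # two hops: sensor -> interneuron -> motors
print(state)                   # the motor neurons now carry the "decision"
```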

                The problem is that you’re asking mostly meaningless questions. Even the loose definitions of consciousness aren’t definable at the ‘atomic level’ - a mind is a mathematical construct. It’s like asking where the files on your computer live; I can point to the sectors of the hard drive where a program is encoded, or even hand you a really, really massive stack of punched tape, but neither of those actually is the computer program. What we call the program is the interaction of a grammar consisting of logical rules and constants running within the linguistic and computational context of an automaton. It’s the same with a mind - it’s the abstract state of an unfathomably complex machine.

              • originalucifer@moist.catsweat.comOP
                3 months ago

                i thought they explained it quite nicely as a system arising from other systems…

                but the fundamental truth that a ‘mind’ is a holographic pattern arising from said network is quite a settled topic

    • kibiz0r@midwest.social
      3 months ago

      The physical world is the hologram.

      Between saccades, fnords, and confabulation, I don’t trust a single thing my senses tell me. But the one thing I know for sure is that I’m conscious.

      So, knowing that only consciousness is “real”, why would I assume it can be recreated through atoms (which are a mere hallucination)?

          • Warl0k3@lemmy.world
            3 months ago

            Alas, philosophers answer questions about the interrelation of minds, but not what a mind actually, chemically, is. They can extemporize at great length on the tendencies of a mind, the definition of consciousness, the value of thought, the many many vagaries of morality. They cannot, unfortunately, sit down and draw a picture of a mind. Many good and important questions can be answered by philosophers, but not every problem can or should be assessed with the tools they have.

            You may be conscious, and you may have many long and deeply opinionated thoughts about what it means to be conscious, and how you can know that you are in fact conscious, but you cannot tell me what consciousness looks like. And to be perfectly honest, I don’t really care.

            I don’t know if you’ve ever done this, but you should sometime present an engineer with the trolley problem. I’ve done this many times, and the invariable result is that they will ask endless questions to establish the parameters and present endless solutions within those parameters so that nobody has to die at all. It is, in short, a problem. Not an ontological tool for unlocking hidden understanding, which falls under the purview of your ‘philosophy’, but a practical problem. Like how you’re going to prevent some big mean mother-hubbard from tying you to the hypothetically metaphorical trolley tracks. And the solution? Is a gun. And if that don’t work, use more gun. Like this heavy caliber tripod-mounted little old number designed by me. Built, by me.

            And you best hope, not pointed at you.

            • kibiz0r@midwest.social
              3 months ago

              You’re presupposing the superiority of science. What good is knowing the chemical composition of a mind, if such chemicals are but shadows on the cave wall?

              You can’t actually witness a rock, in its full objective “rock-ness”. You can only witness yourself perceiving the rock. I call this the Principle of Objective Things in Space.

              Admittedly, the study of consciousness is still in its infancy, especially compared to study of the physical world. But it would be foolish to discard the entire concept when it is unavoidably fundamental. Suppose we do invent teleporters and they do erase consciousness. Doesn’t it say something about the peril of worshipping quantification over all else, that we wouldn’t even know until we had already teleported all of our bread? The entire field is babies. I am heavy ideas guy and this is my PoOTiS.

              • Warl0k3@lemmy.world
                3 months ago

                (I am absolutely going to steal the Principle of Objective Things in Space, that’s wonderful.)

                There’s a drive philosophers have, to question why things are the way they are, through a very specific lens. Why is it wrong to push a fat man onto the trolley tracks, if his death would save six others? Why is there a difference between the perception of the shadows and the perception of the man with the shadow puppets? Does free will exist, and why does that matter?

                These are all the pursuit of meaning, and while they are noble and important questions to ask, they are not questions driven by the pursuit of understanding. Philosophy depends on assumptions about the world that are taken to be incontrovertible, and bases its conclusions on them. The capacity for choice is a classic example, as is the assumption of a causal universe, and though they’re quite reasonable things to assume in most cases, it can get mind-bleedingly aggravating when philosophers apply the same approach to pure fields like mathematics, which require rigorous establishment of assumptions before any valid value of truth can be derived.

                Which is not to attack philosophers. I want to be clear about that, I bring this up just to emphasize that there are differences in thought between the two disciplines (that occasionally those differences in thought make me want to brain them with a chair is unrelated to the topic at hand). The philosophical study and speculation as to and on the nature of consciousness is perhaps the single oldest field of inquiry humanity has. And while the debate has raged for literal ages, we haven’t really gotten anywhere with it.

                And then, recently, scientists (especially computer scientists, but many other fields as well) have shown up and gone “hey look, we can see what the brain looks like, we know how the discrete parts work, we can even simulate it! Look, we’ve got the behavior right here, and… well, maybe… when we get right down to it, it’s just not all that deep?” And philosophers have embraced this, enfolded it into their considerations, accepted it as valid work… and then kept right on asking the exact same questions.

                The truth is, as I’ve been able to study it, that ‘consciousness’ is a meaningless term. We haven’t been able to define it for ten thousand years of sitting around stroking our beards, because it’s premised on assumptions that turn out to be, fundamentally, meaningless. It’s assumed that there is another layer of abstraction, or that there’s a point or meaning to consciousness, or to anything within the Theory of Mind. And I think it’s just too hard to accept that, maybe, it all… doesn’t matter. That we haven’t found any answers not because the question is somehow unanswerable, but because the question was asked in a context that invalidates the entire premise. It’s the philosophical equivalent of ‘null’.

                Sufficiently complex networks can compute and self-reference, and it turns out that when you do that enough, it’ll start referencing The Self (or whatever you’d like to call it). There’s no deeper meaning, or hidden truth. There’s just that, on a machine, a simulation can be run that can think about itself.

                Everything else is just… ontological window dressing. Syntactic sugar for the teenage soul.

              • originalucifer@moist.catsweat.comOP
                3 months ago

                consciousness does not exist outside the physical world (nothing does), so why would you remove it from the study of the physical world?

                why would an exact replica not have all the same properties, including consciousness? or is this just an extraordinary claim without evidence?