ugh I remember this mf from 90s usenet, he would pontificate endlessly but never seemed to actually work on anything
-
@pozorvlak @glitzersachen @darkuncle @regehr
And you can't have thinking without the layer of emotion. Not because reasoning is emotionally motivated, but because emotion is obviously important: you'd need to build it into the system.
These people think the whole brain is just emergent and not tailored to managing the human body in human contexts over deep time.
It's nonsense!
@pozorvlak @glitzersachen @darkuncle @regehr
For most of human history, paragraphs of text have been a reliable sign that there is a thinking human mind that reasoned to create that text. This isn't true anymore.
But text is just like footprints. It's not the thing itself. And it's possible to fake convincing footprints and possible to fake text.
That is all that is happening.
-
@pozorvlak @glitzersachen @darkuncle @regehr
I remember when there was a debate about whether people who couldn't use language were really able to think. Wildly ableist stuff. In the course of the debate some people said that if they didn't "hear" a voice kind of like narration in their mind they weren't thinking.
Which is wild to me as someone whose thoughts are these things I struggle to condense into the limited and awkward strictures of words.
-
@pozorvlak @glitzersachen @darkuncle @regehr
When I read words from others I imagine all their big thoughts. A poem with a few dozen words can contain whole universes of emotion and ideas.
Unless it's a machine, then I imagine big matrices and all the imaginations it gobbled up to make them so it could imitate poetry.
-
@pozorvlak @glitzersachen @darkuncle @regehr
And the other wild thing is that the importance of that text changes if a person simply points to it after reading it and declares "this is what I meant!"
Ok, now I care about it more. Because text is a coffee straw and the mind is an industrial vat full of the thickest of milkshakes.
-
@futurebird I agree that current LLMs are not conscious. But nor are they simple Markov chain text generators - are you familiar with Anthropic's work on transformer circuits? Plus, "current approaches" includes hybrid systems like AlphaGeometry which combine neural networks and symbolic theorem provers. Like I said, I don't think we'll get to AGI simply by iterating on what we have now. But I didn't think we'd see an AI get an IMO gold medal this soon either.
-
@pozorvlak @glitzersachen @darkuncle @regehr
I'm aware of the variation, but I don't think any of it really grapples with the complexity of what it would mean to actually do what some of them claim they're doing.
I feel they keep showing us text, which we are biased to see as "evidence of reasoning," then claiming bigfoot exists.
And when it falls short we're told a little story about adaptive algorithms. And "soon soon soon"
It's been soon for decades. I'm so tired.
-
@pozorvlak @glitzersachen @darkuncle @regehr
I remember when there was a debate about if people who couldn't use language were really able to think. Wildly ableist stuff. In the course of the debate some people said that if they didn't "hear" a voice kind of like narration in their mind they weren't thinking.
Which is wild to me as someone whose thoughts are these things I struggle to condense into the limited and awkward strictures of words.
@futurebird
Ahh thank you for expressing this so well - I tend to say I think in thoughts, not words, and I only produce words when I need to communicate the thoughts externally. But the thoughts are kinda... linkages and relationships between things, and patterns, I think. Like a great big relational database in my head which exists without words needing to be involved.
-
If there were a technology that would allow one to experience the thoughts of another person I wonder what we'd learn from those experiences?
Let's say you could produce a rough map of the state of a nervous system (not just the brain, thinking is a function of the whole body I suspect) and somehow transmit and remap it to another person. So you would feel and think for a moment some analog of another mind. Would it be Beautiful? Alienating?
Or is such a mapping impossible?
-
Now I'm thinking of a horror scifi. The machine projects another person's mind on to you but the result is that you just basically become that person and it takes years to recover. Since the only way to really bridge that gap would be to erase yourself.
So the mad scientist who just wants to be understood ends up turning everyone into someone who is, like her, tortured by a sense of isolation.
OK I've clearly gone on a tangent and I should be working on the book now anyways.
-
@futurebird @3TomatoesShort
One time, I woke up and just didn't feel like I was me. I felt OK, but it was very odd. I can't remember exactly what it did feel like, because being me is more or less the shape all my memories fit into. So it might be like that: temporary depersonalisation that you couldn't integrate into an experience you'd had, since you weren't really you at the time.