the whole ai-bro shtick about "ai democratizes art/programming/writing/etc" always seemed so bs to me, and i couldn't put it into words, but i think i now know how.
ai didn't democratize any of these things. People did. The internet did. if all these things weren't democratized and freely available on the internet before, there wouldn't have been any training data available in the first place.
the one single amazing thing that today's day and age brought us is that you can learn anything at any time for free at your own pace.
like, you can just sit down, and learn sketching, drawing, programming, writing, basics in electronics, pcb design, singing, instruments, whatever your heart desires and apply and practice these skills. fuck, most devs on fedi are self taught.
the most human thing there is is learning and creativity. the least human thing there is is trying to automate that away.
(not to mention said tech failing at it miserably)
-
@miki @KatS if i memorize every possible answer to a specific test, i can pass too. doesn't mean i know shit about fuck.
There's no actual thinking or reasoning involved (and no, reasoning models don't actually "reason"), so yeah, an LLM isn't actually intelligent, it just shows how flawed our tests for intelligence are.
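To make the memorization point concrete, here's a minimal sketch (Python, with entirely made-up questions): a "solver" that only looks answers up in a table aces the questions it was built from and falls flat on anything unseen, without anything you could call understanding.

```
# toy "solver" that memorizes question -> answer pairs (hypothetical data)
memorized = {
    "2 + 2": "4",
    "capital of France": "Paris",
}

def answer(question: str) -> str:
    # no reasoning, just recall; unseen questions get nothing
    return memorized.get(question, "no idea")

print(answer("2 + 2"))              # "4" -> looks smart on seen questions
print(answer("capital of France"))  # "Paris"
print(answer("3 + 5"))              # "no idea" -> nothing generalizes
```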
To get some actual intelligence, thinking or reasoning involved, I'd reckon we'd have to fundamentally change something in the architecture of LLMs and use a fuckton more computing resources for a single model. And considering how much energy the current tech already wastes, that the whole shtick that made LLMs (and more broadly generative AI) work in the first place is "we discovered that there comes a point where the output gets better when we throw ridiculous amounts of compute at the problem", and that it's already getting super difficult to run and maintain, I don't see how scaling that up even further is supposed to work.
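For what it's worth, that "throw more compute at it" observation is usually written down as an empirical power law with diminishing returns. A rough sketch of the shape (Python; the constants here are made up purely for illustration, not real fitted values):

```
# toy scaling-law shape: loss falls off as a power law in compute (illustrative only)
def predicted_loss(compute_flops: float,
                   irreducible: float = 1.7,  # hypothetical floor the model never beats
                   scale: float = 100.0,      # hypothetical constant
                   exponent: float = 0.15) -> float:
    return irreducible + scale * compute_flops ** -exponent

for flops in (1e18, 1e21, 1e24):
    # each 1000x jump in compute buys a smaller and smaller improvement
    print(f"{flops:.0e} FLOPs -> loss ~ {predicted_loss(flops):.2f}")
```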
Honestly, either you're unreasonably optimistic, or you've never taken a look at how things actually work under the hood, but I really recommend you take a closer look at the technology you praise so much.
A couple things you could take a look at (without an AI summarizer, otherwise you'd learn jack shit):
"Attention Is All You Need", the paper that sparked the whole AI craze and the development of GPT models, and "The Illusion of Thinking: Understanding the Strengths and Limitations of Reasoning Models via the Lens of Problem Complexity", which tests reasoning models across all sorts of levels of problem complexity to work out where they're strong and where they break down.
Honestly, before you make any claims about where the tech could go and what it could do, you should have a rough idea of how things actually work under the hood, otherwise, no offense, you're just talking out of your arse.
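If it helps, the core trick from "Attention Is All You Need" fits in a few lines. A minimal sketch of scaled dot-product attention (Python/NumPy, random toy tensors, single head, no training loop):

```
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # every output row is a weighted mix of the value rows,
    # weighted by how well that query matches each key
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query/key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))  # 4 toy tokens, dimension 8
print(scaled_dot_product_attention(Q, K, V).shape)     # (4, 8)
```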
@lucydev @KatS I have very specifically said "unseen questions."
If memorizing answers was a viable strategy to pass that test, humans would have done so.
If you still believe that there's no possible use for a tool that can get gold on a never-before-used set of math olympiad questions given a few hours of access to a reasonably powerful computer, and that the existence of that tool will have no interesting impact on the world... I don't know what to tell you.
-
@miki @KatS > If you still believe that there's no possible use for a tool that can get gold on a never-before-used set of math olympiad question given a few hours of access to a reasonably powerful computer, and that the existence of that tool will have no interesting impact on the world...
How reliable is that source? And if that's true, is it really reasonable to bet everything on this, and let this do all your work, when a) you end up completely dependent on the tech and b) you utterly destroy the environment in the process?
Real world problems may be less complex but might require much more context.
Oh, and don't get me started on accountability. There's a reason why curl is closing their bug bounty program.
-
@lucydev @KatS Curl is closing their bug bounty program because it's far too easy to use LLMs to produce slop. That doesn't mean you can't use LLMs to produce non-slop, just that submitting slop is a technique some people have found to make money without much effort, and we haven't yet sufficiently adapted to it. This is a genuine problem.
-
I think it's accurate
Instead of building your own skill, control someone else's
Sure they didn't _consent_, but democracies don't ask opposition voters for consent.
It's an accurate analogy and shows why democracy isn't a good thing 🤪
-
alr the sentence "the most human thing there is is learning and creativity. the least human thing there is is trying to automate that away." goes so hard imma drop it in my bio now
-
@lucydev indeed, it's "easy" to "democratize" if you just put a cute bow on top of something already built by others and then HIDE that they did it.
Also no democratization is possible when one relies on a black box controlled by others, so even if the technology itself was fine (which it's not IMHO) then at least most if not all commercializations of it are huge red flags trying to establish dependency.
-
It's also condescending, insulting even, to disabled people to suggest that if some of them, IDK, struggle with a paint brush, what is needed is for the computer to draw it for them rather than for all of us to look and listen with more care to the work that they create.
1/
-
I heard a guy say AI could "make art more diverse" and he had all these images of black elves and dwarfs. As if "lacking diversity" were just a surface issue, not one built into who gets to participate, who has the time for creative expression.
As if just pasting in a different colored face were the same thing as having an artist who wanted to draw that diversity and whose work would emerge from, and be improved by, the culture and experiences of the creator.
2/
-
These AI-as-equity arguments aren't coming from people who have ever said anything about "equity" before this moment, and they will never say anything about equity after this moment. They don't really care about equity. They just want to have something to say that might pause our criticism.
"What if it really could help people?"
Let that go. If it could help people you'd see people using AI effectively to help people.
They are using it as cover.
3/3
-
> It democratizes it by making it available for the people who can't / don't want to / don't have the time for learning it.
No, I'm sorry, but it doesn't.
What it "democratises" is being an art director who commissions a machine to generate things derived from the (uncredited, un-compensated) work of others (whose lack of consent was gleefully violated).
Gutenberg democratised learning, with his movable-type press.
Encyclopaedias took that a step further, and Wikipedia amped it up again.
Blogs and Youtube democratised the sharing of knowledge and skills.
All these things have enabled people to learn how to do a thing.
But if you typed in a description and got a picture in return, you did not create that picture. You commissioned it.
-
@lucydev I see putting a prompt into AI and hoping that the generated code is correct as a bad idea, especially in complex apps that have long-term maintainability considerations, or when security / money / lives are at stake.
For throwaway projects (think "secret santa style gift exchange for a local community with a few extra constraints, organized by somebody with 0 CS experience"), vibe coding is probably fine.
For professional developers, LLMs can still be pretty useful. Even if you have to review the code manually, push back on stupidity, and give it direction on how to do things, not just what to do (which is honestly what I do for production codebases), it's still a force multiplier.
@miki @lucydev So it's democratizing code by giving many people the ability to do it badly.
There is a real question here - something related to the balance of
- harm to society and individuals of allowing enormous quantities of badly done things to be created, producing circumstances where actual expertise, understanding, and competence are devalued to the point of no longer being viable careers
- benefit to society and individuals of enabling recognition of genius in a few people who would not otherwise have access to a given audience or skill.
I am broadly in favour of enabling experts and competent individuals to earn a living from their expertise and competence. I am also broadly in favour of enabling everyone to develop whatever abilities and interests they have.
But people with ideas for software lack time, skill, and resources, not access. If there's a demand, why is there no business writing throwaway apps for people? Will AI really be cheaper in the long run?