Chebucto Regional Softball Club

myrmepropagandist

@futurebird@sauropods.win
A forum for discussing and organizing recreational softball and baseball games and leagues in the greater Halifax area.
Posts: 1.3k · Topics: 248 · Shares: 3.7k · Groups: 0 · Followers: 1 · Following: 0

Posts


  • Things my husband didn't know:
    myrmepropagandist

    @imp3tuz

    OK but how do they make the IT departments in various companies and agencies and schools all start nagging everyone in the same way?

    Do they give them nagging lessons and call it "training" or something?

    Uncategorized

  • Things my husband didn't know:
    myrmepropagandist

    @theplaguedoc

    We will get to it. Though both of us watch very little fiction, it's kind of a crime really.

    I'm a Howl's Moving Castle girl.

    Uncategorized

  • Things my husband didn't know:
    myrmepropagandist

    Things my husband didn't know:

    * What Grok is (in the context of X)
    * That OpenAI used to be non-profit
    * What Studio Ghibli is.
    * The Studio Ghibli Controversy
    * What "Training Data" are

    Thought "everyone" knew this stuff. That's *my* bubble I get it.

    He's very up to date on news, just not this.

    Nonetheless his IT department keeps asking them to "learn Copilot" so they can "be more efficient," and everyone hates it. (Is Microsoft doing something to make IT departments do this all over the place?)

    Uncategorized

  • Maybe I should have been an evil hacker.
    myrmepropagandist

    @llewelly

    It's always family, friends and coworkers who fail you with the social engineering, isn't it?

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @david_chisnall @rep_movsd @GossiTheDog

    It's that trust that I'm talking about here. The process makes sense to me. But, I've also seen prompts that stump these things. I've seen prompts that make it spit out images that are identical to existing images.

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @david_chisnall @rep_movsd @GossiTheDog

    This is what I've learned by working with the public libraries I could find, and reading about how these things work.

    To really know if an image isn't in the training data (or something very close to it) we'd need to compare it to the training data and we *can't* do that.

    The training data are secret.

    All that (maybe stolen) information is a big "trade secret."

    So, when we are told "this isn't like anything in the data," the source is "trust me bro."

    Uncategorized
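
    A minimal sketch, assuming access that in reality does not exist, of the comparison the post above calls impossible: if the training images sat in a local folder, a perceptual-hash scan could flag near-identical matches to a generated image. The folder name, the .jpg glob, and the distance threshold are made up for illustration, and it leans on the third-party Pillow and ImageHash packages.

        # Sketch only: assumes a local copy of the training images, which
        # is exactly what is not available. Paths and threshold are
        # illustrative.
        from pathlib import Path
        from PIL import Image
        import imagehash

        def near_duplicates(generated_path, training_dir, max_distance=5):
            """Return training images whose perceptual hash is within
            max_distance bits of the generated image's hash."""
            target = imagehash.phash(Image.open(generated_path))
            hits = []
            for path in Path(training_dir).glob("*.jpg"):
                if target - imagehash.phash(Image.open(path)) <= max_distance:
                    hits.append(path)
            return hits

        # Without the corpus, this check cannot be run at all, which is
        # the point the post is making.
        print(near_duplicates("generated.png", "training_images/"))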

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @david_chisnall @rep_movsd @GossiTheDog

    This has always been possible; it was just slow. I think the innovation of these systems is building what amounts to search indexes for the atomized training data by doing a huge amount of pre-processing, "training" (I'm starting to think that term is a little misleading). This allows this kind of result to be generated fast enough to make it a viable application.

    Uncategorized
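
    Read only as a loose analogy (it is not how image generators or LLMs literally work), the trade the post above describes, slow pre-processing once so that individual requests are fast later, can be sketched as building a small vector index. Here embed() is a hypothetical stand-in for whatever expensive featurization happens during "training", and the corpus strings are invented.

        import numpy as np

        def embed(text):
            # Hypothetical placeholder for an expensive featurization
            # step: hash characters into a fixed-size unit vector.
            vec = np.zeros(64)
            for i, ch in enumerate(text):
                vec[(i + ord(ch)) % 64] += 1.0
            norm = np.linalg.norm(vec)
            return vec / norm if norm else vec

        # "Training" phase: do the slow work once over the whole corpus.
        corpus = ["ants farm aphids", "bees dance to navigate", "spiders spin silk"]
        index = np.stack([embed(doc) for doc in corpus])

        # Query phase: a single cheap dot product against the precomputed
        # index, which is where the speed the post attributes to
        # pre-processing comes from in this analogy.
        query = embed("how do bees navigate")
        print(corpus[int(np.argmax(index @ query))])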

  • I was sent this photo¹ and it is breaking my brain
    myrmepropagandist

    @SnoopJ

    It is related to this:
    https://www.booooooom.com/2016/05/09/bicycles-built-based-on-peoples-attempts-to-draw-them-from-memory/

    Uncategorized cycling biketooter

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @kevingranade @RustedComputing @rep_movsd @GossiTheDog

    "mammal brain"

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @rep_movsd @GossiTheDog

    "Its possible to put some guardrails on what the AI can be asked to do."

    How?

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @rep_movsd @GossiTheDog

    OK you came at me with "Because thats how the math works." a moment ago, yet *you* may think these programs are doing things they can't.

    'Intelligence working towards a reward' is a bad metaphor. (It's why some see the apology and think it means something.)

    They will say "exclude X from influencing your next response" or "tell me how you arrived at that result" and think that, because an LLM will give a coherent-sounding response, it is really doing what they ask.

    It can't.

    Uncategorized

  • @josh0 @futurebird @GossiTheDog
    myrmepropagandist

    @rep_movsd @josh0 @GossiTheDog

    "Whether or not mainstream AI has been trained on objectionable content cannot be proven by anyone except the law."

    "LLM doesnt need to be trained on such content to be able to generate them."

    The first statement is true; the second is a wild guess. You don't know.

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @rep_movsd @GossiTheDog

    But you could only state that it could generate something not in the training data... if you knew what was in the training data. But that is secret. So you don't know. You don't know if there is a near identical image to the one produced in the training data.

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @rep_movsd @GossiTheDog

    There are things that these generators do well, and things that they struggle with, and things they simply can't generate. These limitations are set by the training data.

    It's easy to come up with a prompt that an engine just can't manage, since it has nothing to reference.

    Uncategorized

  • Twitter generated child sexual abuse material via its bot..
    myrmepropagandist

    @rep_movsd @GossiTheDog

    "LLM doesn't need to be trained on such content to be able to generate them."

    People say this but how do you know it is true?

    Uncategorized

  • I just want one pic of me in 2026 that’s half as flattering as this.
    myrmepropagandist

    @batkaren

    this is scary

    Uncategorized

  • It's beginning to become a scarf
    myrmepropagandist

    @sunumbral

    Did you spin this yarn? Is it silk? It's so shiny and well defined.

    Uncategorized knitting

  • Do you think Sesame Street is actually independent or do they have a puppet government?
    myrmepropagandist

    @faithisleaping

    As someone who basically LIVES on Sesame Street: we are in NYC. But like many neighborhoods (and even floors of some buildings), there are local Mayors in addition to the Mayor of the city. So Oscar is in charge of the street.

    Uncategorized

  • Story Idea:
    myrmepropagandist

    @johnlogic

    Maybe "Hypatia"? IDK

    Uncategorized

  • Story Idea:
    myrmepropagandist

    @johnlogic

    Kind of shocked they are still around.

    Uncategorized