sometimes a dragon

he/they, queer, furry, ζ, vegan

  • 0 Posts
  • 31 Comments
Joined 1 year ago
Cake day: September 7th, 2024

  • Ugh, I’m so fucking tired of this shit.

    I can imagine that an LLM can find bugs. Bugs often follow common patterns, and if anything, an LLM is a pattern matcher, so if you let it run on the whole world of open source code out there, I’m sure it’ll find some stuff, and some of it might be legit issues.

    But static code analysis tools have been finding bugs for decades, too. And now that an AI slop machine does it, it’s supposed to bring about dystopian sci-fi alien wars?

    Why are people hyped about that?

    (Also this poster makes wrong claims about every exploit being worth millions and such, but the rest of it is so much more ridiculous, it drowns out the wrongness of those claims.)

  • This could be regarded as a neat fun hack, if it weren’t built by appropriating the entire world of open source software while also destroying the planet with obscene energy and resource consumption.

    And not only do they do all that… it’s also presented by those who wish this to be the future of all software. But for that, a “neat fun hack” just isn’t enough.

    Can LLMs produce software that kinda works? Sure, that’s not new. Just like LLMs can generate books with grammatically correct text that’s vaguely about a given theme. But is such a book worth reading? No. And is this compiler worth using? Also no.

    (And btw, this approach only works with an existing good compiler as “oracle”. So forget about doing that to create a new compiler for a new language. In addition, there’s certainly no other language with as many compilers as C, providing plenty of material for the training set.)