• 1 Post
  • 249 Comments
Joined 3 years ago
Cake day: June 16th, 2023


  • Zarxrax@lemmy.world to Programmer Humor@programming.dev · Vibe Coding
    9 months ago

    I consider myself a bad hobbyist programmer. I know a decent bit about programming, and I mainly create relatively simple things.

    Before LLMs, I would spend weeks or months working on a small program, but with LLMs I can often complete it significantly faster.

    Now, I wouldn’t consider myself a “vibe coder,” because I don’t expect the LLM to create the entire application for me, but I may let it generate a significant portion of the code. I generally come up with the basic structure of the program and figure out how it should work, then ask it to write individual functions, or pieces of functions. I review the code it gives me and check whether it makes sense. It’s kind of like having an assistant helping me.

    Programming languages are how we communicate with computers to tell them what to do. We have to learn to speak the computer’s language. But with an LLM, the computer has learned to speak our language. So now we can program in plain English, but it’s like going through a translator. You still have to be very specific about what the program needs to do, or it will have to guess at what you wanted. And even when you are specific, something might get lost in translation. So I think the best way to avoid these issues is, like I said, not expecting it to make an entire program for you, but using it as an assistant to create small parts at a time.



  • I’m not against such a law in theory, but I have many questions about how it would be implemented and enforced. First off, what exactly counts as AI-generated? We are seeing more and more AI features being added in lots of areas, and I could certainly envision a future, in a few years’ time, where nearly all photos taken with high-end phones are altered by AI in some way. After that, who exactly is responsible for ensuring that things are tagged properly? The individual who created the image? The software that may have done the AI processing? The social media site the image was posted on? And if the penalties for failing to label an AI image are harsh, what’s to stop sites from just putting up a blanket disclaimer saying that ALL images on the page were generated by AI?




  • AI often gets painted as a people-vs-businesses issue, but in many cases that’s not what it is. The EFF is arguing for fair use, something they have stood for as long as I can remember. As the article argues, the businesses creating AIs can easily abide by this law; it’s the little guys training things who would be impacted the most.


  • Graduated around 2008, as the economy was crashing, and struggled to find anything. Eventually got hired at a call center for a large company (about a year after I had applied). After a few years of that I was able to transfer into a different department where I didn’t have to deal with customers directly, then kept getting promoted into different positions until I found one I was really comfortable in.