PhilipTheBucket
- 16 Posts
- 21 Comments
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
1 · 4 months ago
You define it in exactly the same way you just did. Completely fine, you have to do it for lots of things. It’s nice that Python can do that too.
Now, I’ll grab a random snippet of code from some random file from my source dir:
```python
existing_bookmarks = db.session.execute(
    text('SELECT post_reply_id FROM "post_reply_bookmark" WHERE user_id = :user_id'),
    {"user_id": user_id}).scalars()
reply = PostReply.query.filter(PostReply.id.in_(existing_bookmarks),
                               PostReply.deleted == False).first()
if reply:
    data = {"comment_id": reply.id, "save": True}
    with pytest.raises(Exception) as ex:
        put_reply_save(auth, data)
    assert str(ex.value) == 'This comment has already been bookmarked.'
```
You can see some classes in use, which again is fine. But you also see inline instantiation of some reply JSON, a database returning a list of post_reply_id values without needing a special interface definition for returning multiple values, and a lot of cognitive and computational load per line of code being saved, because the language features are doing the heavy lifting instead of making people depend on user-defined classes for everything. It means you don’t have as many adventures through the code where you’re trying to modify a user-defined interface class, you don’t need as much strong typing, that kind of thing.
I would bet heavily that a lot of the things happening in that short little space of code would need specific classes to get them done if the same project were implemented in some C++-derived language. Maybe not, I just grabbed a random segment of code instead of trying especially hard to find my perfect example to prove my point.
That’s fine; there are significant weaknesses to Python too, and I’m not trying to say “yay Python, it’s better for everything,” anything like that. I’m just saying that if you don’t get familiar with at least some language that does things more that way, and instead get accustomed solely to user-defined classes or templates for every information exchange or functional definition, then you’ll be missing out on a good paradigm for thinking about programming. That’s all.
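To make the contrast concrete, here’s a hypothetical little sketch (mine, not from the project above; all the names are made up) of the same small data exchange done both ways in Python:

```python
# Style 1: the "interface class for everything" approach you tend to
# carry over from C++/C#-style designs.
class BookmarkResult:
    def __init__(self, reply_id: int, already_saved: bool):
        self.reply_id = reply_id
        self.already_saved = already_saved

def lookup_class_style(reply_id: int) -> BookmarkResult:
    return BookmarkResult(reply_id, already_saved=True)

# Style 2: built-in structures carry the data directly; no interface
# definition needed just to pass a couple of values around.
def lookup_dict_style(reply_id: int) -> dict:
    return {"reply_id": reply_id, "already_saved": True}

result = lookup_dict_style(42)
print(result["reply_id"], result["already_saved"])
```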
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
1 · 4 months ago
> Complex data structures are not “more of a C++ type of program structure”.
Oh, they are not at all. Equating complex data structures with user-defined data structures (in the form of classes and fields and whatnot), and using the latter as the primary method of storing and working with data (so that you’re constantly having to bring into your mental scope a bunch of different classes and how they need to interact), is 100% a C++ type of program structure. It’s pretty unusual in my experience in Python. Or, I mean, it’s perfectly common, but it’s not primary in the same universal way that it is in C++ and derivatives. It gets to exist as its own useful thing without being the only way. That’s what I am trying to say.
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
0 · 4 months ago
IDK, I just have never really had this become a serious issue for me. I get what you mean, some actions are a little bit of a pain in the neck because people are often sloppy about typing, but literally the only time I can remember it being an issue at all has been when numpy is involved and so I have to figure out if something is a native Python thing or a numpy-fied custom structure.
I mean there’s just not that many types. Generally something is a list, a number, a map, or a string, and it’s pretty obvious which. Maybe there are OOP domain things where a lot of variables are objects of some kind of class (sort of more of a C++ type of program structure), and so it starts to become really critical to have strong type tools, I’m just saying I haven’t really encountered too much trouble with it. I’m not saying it’s imaginary, you may be right in your experience, I’m just saying I’ve worked on projects way bigger than a few hundred lines and never really had too much of an issue with it in practice in my experience.
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
41 · 4 months ago
> Plus I felt Python was too new and would skip a lot of core programming skills I’d just like to know. I’m not super interested in doing it the new way with all the helpers, or I won’t feel like I learned anything.
Okay, you definitely want to learn C then. C# and C++ both add a ton of helpers. C# has a massive runtime environment that’s opaque and a little bit weird, and C++ has a massive compile-time environment that’s opaque and very weird. It’s sort of pick-your-poison. If you learn C and get skilled with it, you’ll understand what is actually going on and have strong fundamentals for whatever higher-level language you want to learn in the future.
Put another way: C# will hide just as many of the fundamentals and hardcore details from you as Python will; it’ll just do it in a weird and counterintuitive fashion that makes it more confusing, with more weird C#-specific details.
> I’d eventually like to learn Unity as well, so I decided on C#.
I would actually just cut out the middleman and start with the Unity editor, then. It might be a really good introduction to the nature of programming in general without throwing a bunch of extra nonsense at you, and in a really motivating format.
> I do have the .NET SDK and it seems to try to compile a simple program, it just throws errors even on an example program that shouldn’t have any. I’m sure it’s something dumb.
What’s the program and what’s the error? I’m happy to help if something jumps out at me. I’m voicing my opinion otherwise on what might be better ways to attack this in general, but I’m sure I or other people here can help sort out the issues if you really want to take this approach and you’re just getting stuck on something simple.
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
26 · 4 months ago
C# represents about 12% of the jobs out there.
https://www.devjobsscanner.com/blog/top-8-most-demanded-programming-languages/
It’s not unpopular, but it’s definitely not “massively” popular anymore. Different languages have different strengths and weaknesses, but C# has a few more weaknesses than most, in my opinion. Yes, there’s nothing wrong with learning any particular language you want to learn (and I’m a little surprised to see C++ has fallen significantly below C#, but sure, fair enough). I’m just struggling to see an upside to learning it in the modern day (and now, knowing more about what this person’s goal is, I feel even more strongly that C# is the wrong answer for them. In my opinion.)
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
2 · 4 months ago
You can do strict typing in Python if you want it; it’s very highly recommended if you’re doing a big project.
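For what it’s worth, a minimal sketch of what I mean (type hints plus an external checker; mypy is one common choice, and the names here are just made up for illustration):

```python
# CPython ignores these hints at runtime, but a checker like mypy
# flags mismatches before the code ever runs.
def bookmark_ids(rows: list[dict[str, int]]) -> list[int]:
    return [row["post_reply_id"] for row in rows]

ids: list[int] = bookmark_ids([{"post_reply_id": 1}, {"post_reply_id": 7}])
ids.append("oops")  # mypy reports: incompatible type "str"; expected "int"
```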
PhilipTheBucket@piefed.social to Programming@programming.dev • Best C# IDE/Compiler for Linux? (English)
310 · 4 months ago
I really would not recommend specializing in C# at this point in computing history. You can do what you want obviously, but Python is much more likely to be what you want. C++ or Java might be okay if you want a job and are okay with a somewhat dated / not-ideal language, or you could learn one of the proliferation of niche backend Linuxy languages, but C# has most of the drawbacks of C++ and Java without even their relative level of popularity.
IDK what issue you’re having with VSCode, but I think installing the .NET SDK and then using dotnet by hand from the command line, to test the install, might be a good precursor to getting it working in VSCode. But IDK why you would endeavor to do this in the first place.
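Something like this would exercise the toolchain end to end (a minimal sketch; these are standard dotnet CLI commands, but the project name “hello” is just an example):

```
dotnet --info                # confirm the SDK is installed and on the PATH
dotnet new console -o hello  # scaffold a minimal console project
cd hello
dotnet run                   # build and run it; should print "Hello, World!"
```

If that works, any remaining VSCode problems are probably extension or configuration issues rather than a broken SDK install.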
Here’s the pinout for the webcam component: https://github.com/FrameworkComputer/Framework-Laptop-13/tree/main/Webcam
Unfortunately it isn’t really clear whether the switch positions are in the pinout because it’s the mainboard’s job to implement shutting off the camera, or whether they’re just there as information, with the webcam module itself responsible for shutting it off in hardware. I have no idea which, but it wouldn’t be super-hard for someone capable with EE to take off the bezel and fool around with it to find out (or just pay $19 for the magic of buying two of them, if you didn’t want to take apart your own laptop for it).
They say they provide full schematics on demand to repair shops (https://knowledgebase.frame.work/availability-of-schematics-and-boardviews-BJMZ6EAu). I’m not sure why they don’t want to just post them publicly, so in that sense you might be right, but they also don’t seem like they are trying to keep them or the interface details of the webcam module fully top secret either.
They do seem to publish enough information that someone could figure out the answer if they wanted to. (People in the forums have fooled around with them and seem convinced that they are actually hardware switches: https://community.frame.work/t/how-do-the-camera-and-microphone-switches-work/4271 IDK whether that’s accurate, but that’s what the forum people think.)
No idea why you’re trying to lecture me from this position of authority about taking apart PCBs and whatnot. Anyway, that’s how it works, hope this is helpful for you.
I sort of suspect that the wiring is in a diagram somewhere. I could be wrong, but that would be my guess. It’s not on a PCB; that’s up in the bezel where it’s just wires and stuff.
Framework laptops have a little physical switch to turn off the camera / mic when you don’t want them.
The original SGI webcams, some of the first that ever existed, actually had a physical plastic cover that you could slide over them when you didn’t want the camera on. “No, I don’t trust your hardware any more than your software. I shouldn’t need to. Stop looking at me when I don’t want you to, and prove to me that you are not, or else I will be suspicious.” Back in those days that was sort of a universal point of view among internet people, I think…
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • Please spare me from having to get in touch with that shit I wrote back then (English)
7 · 4 months ago
I thought I had it worked out, how to sort of strike a balance so I can keep my focus intact and let it be helpful without wasting time constantly correcting its stuff or shying away from actually paying attention to the code. But I think my strategy of “let the LLM generate a bunch of vomit to get things started and then take on the correction and augmentation from a human standpoint” has let the overall designs at a high level get a lot sloppier than they used to be.
Yeah, you might be right, it might be time to just set the stuff aside except for very specialized uses.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • Please spare me from having to get in touch with that shit I wrote back then (English)
4 · 4 months ago
Certainly possible.
I’m also genuinely a little bit alarmed, looking back now, at the quality of my pre-LLM code vs. the with-LLM code.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • Please spare me from having to get in touch with that shit I wrote back then (English)
11 · 4 months ago
IDK, I just popped open a project from 10 years ago and it’s perfectly clean; it’s actually better than some of my modern code because it’s not LLM-ified to save time.
I think it has a lot more to do with whether it was made in that “kind of crappy, IDK what I’m doing” phase of programming. Some of your old stuff is going to be in that category, sure. As long as you’re out of that, however long it took you to get there or however far away it was in time, your code should be good.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • Please spare me from having to get in touch with that shit I wrote back then (English)
6 · 4 months ago
Yeah, that sounds about right lol. All my Python projects for years were basically writing C in Python. It actually took until I got to look at the code ChatGPT likes to generate before I learned idiomatic Python. My first database project was based on the Unix philosophy, where everything was strings (no ID keys, no normalization), because Unix is good.
The client wasn’t happy when they looked at the DB code lmao. Whatever, it worked, they still paid us and I didn’t do it again.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • Please spare me from having to get in touch with that shit I wrote back then (English)
35 · 4 months ago
Am I the only one who likes looking at my old code? Generally I feel like it’s alright.
Usually the first project when I’m learning how to use some new language or environment is super-shitty. I can tell it’s very bad, usually I don’t like interacting with it if I have to make changes, but it’s still not overly painful. It’s just bad code. And that one exception aside I generally like looking at my code.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • You typical Node project (English)
4 · 4 months ago
Yeah. I feel like in a few years, when literally nothing works or is maintainable, people are going to have a resurgent realization of the importance of reliability in software design: that just throwing bodies and lines of code at the problem builds up a shaky structure that isn’t workable anymore once it grows beyond a certain size.
We used to know that, and somehow we forgot.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • You typical Node project (English)
10 · 4 months ago
Yeah. I have no idea what the answer is, just describing the nature of the issue. I come from the days when you would maybe import like one library to do something special like .png reading or something, and you basically did all the rest yourself. The way programming gets done today is wild to me.
PhilipTheBucket@piefed.social to Programmer Humor@programming.dev • You typical Node project (English)
202 · 4 months ago
I sort of have a suspicion that there is some mathematical proof that, as soon as it becomes quick and easy to import an arbitrary number of dependencies into your project along with their dependencies, the size of the average project’s dependency tree starts to follow an exponential growth curve, increasing every year without limit.
I notice that this stuff didn’t happen with package managers + autoconf/automake. It was only once it became super-trivial to do from the programmer side that the growth curve started. I’ve literally had trivial projects pull in thousands of dependencies recursively, because it’s easier to do that than to take one hour implementing a little modified-file watcher function or something.
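For scale, the kind of modified-file watcher I mean is roughly this much code (a minimal polling sketch of my own, not any particular library’s API):

```python
import os
import time

def watch(paths, on_change, interval=1.0):
    """Poll files and call on_change(path) whenever a file's mtime changes."""
    mtimes = {p: os.path.getmtime(p) for p in paths}
    while True:
        time.sleep(interval)
        for p in paths:
            current = os.path.getmtime(p)
            if current != mtimes[p]:
                mtimes[p] = current
                on_change(p)

# Usage: watch(["app.css"], lambda p: print(f"{p} changed"))
```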
PhilipTheBucket@piefed.social to Open Source@lemmy.ml • Sonatype Uncovers Global Espionage Campaign in Open Source Ecosystems (English)
6 · 4 months ago
Yeah, exactly. If you read the Snowden leaks to learn the details of what some of their actual capabilities are (smuggling flawed keys into the DH exchange for most major web browsers, for example), it makes this stuff look like kids in their basements fucking around.

Dude, you were the one that asked the fucking question lol