I’m the administrator of kbin.life, a general-purpose/tech-oriented kbin instance.

  • 2 Posts
  • 839 Comments
Joined 1 year ago
Cake day: June 29th, 2023

  • Going to second the other comments. Even without archinstall, it feels like it will be harder than it actually is. Just save yourself a bit of time and configure the network and install a console editor (nano/vim, whatever) while still in the chroot (if going full manual). It was a minor pain for me to work around not having done that.

    There are pages discussing how to do everything (it helps to have a laptop with a browser, or a phone, to look them up). At the end you generally know exactly what you installed (OK, no one watches all the dependencies), and I’ve found any borks that happen are easy to fix because I know what I installed.


  • r00ty@kbin.life to linuxmemes@lemmy.world · Snap... · 8 hours ago

    I remember those times too. The difference today is that there are so many more libraries, and projects use those libraries a lot more often.

    So using configure and make means the user also has the responsibility of keeping all those libraries up to date. And again, if we’re talking about not using binary installs, each of those needs its own regular configure/make process too. It’s not that unusual for large packages to have dependencies on 100+ libraries, at which point building and maintaining the builds for all of them yourself becomes untenable, really. However, I think Gentoo exists to automate a lot of this while still building from source.

    I understand why binaries with references to other binary packages as prerequisites are used. I also understand where the limits of this are and why AppImage/Flatpak/snaps exist. I just don’t particularly like the latter as a concept, but I accept there are times you might need them.




  • This does tally with what I’ve been hearing. Where I’m at there have been a few hires straight into senior. I’ve not heard of an official junior freeze, but at the same time it’s been a long time since I’ve seen a new one.

    The problem, as I commented before, is that if we no longer bring in junior devs to gain this kind of experience, we lose the flow of junior -> senior. But in most places, the people making the decisions won’t consider anything beyond the end of the current financial year.




  • I think it goes further than that. There are two things happening with regard to AI and software development.

    1: Stack Overflow has become less commonly used as a resource for solving problems. As you say, that’s a problem for the input LLMs need in order to solve future problems.
    2: Junior developers are being hired less because of AI. I assume the idea is that seniors will use AI in the same way they would usually use juniors. Except they’ve done what business always does: not think one bit about the future. Today’s senior developers are yesterday’s junior developers.

    The combination of the AI performance drop due to point 1 and the lack of new developers because of point 2 potentially makes for a bad future for the profession.


  • We used to have it terrible in the UK in the 90s and 2000s. Basic ADSL was trialled in 1999 and available in maybe late 2000 I think. But it stagnated for a while.

    When it comes to fibre, interesting things are happening. As well as the “national” (although privatised) telco installing it, there are many independent companies fitting it. Where I live I have the option of the official telco (1000/110) and a private company (1000/1000). Of course I chose the latter :P

    Some people have 3 or more options.

    Yeah, in the future there might well be a handful of overall winners that vacuum up the losers and carve up the territory. But right now it’s a good time for normal people… at least for internet.

    EDIT: Just to add, some are ISPs and will only sell their own product. Some are wholesale, so even if they’re the only company in your area, you can often buy from multiple ISPs through them.




  • The way I read it, the developer wanted opt-out but it’s likely it will be opt-in. I’m fine with opt-in and vehemently against opt-out for telemetry.

    I would prefer the information was statistical only. Rather than the hostname (assuming they only want the hostname to be able to separate the data and follow changes over time), a much better idea would be some kind of hash based on information unlikely to change, but not enough information that the original data could realistically be brute-forced back out of the hash. That way all they know is that this data came from the same machine, but they cannot identify the machine. Maybe some kind of unique but otherwise untrackable ID could be created at install time and used ONLY for this purpose and no other.
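    A minimal sketch of that last idea (the paths and names here are hypothetical, not any project’s actual telemetry code): a random ID is generated once at install time, stored locally, and attached to every report, so the collector can follow one install over time without being able to identify the machine behind it.

        # Opaque install-time telemetry ID (hypothetical paths/names).
        # The ID is purely random, so it carries no machine information;
        # it only lets the collector group reports from the same install.
        import uuid
        from pathlib import Path

        ID_FILE = Path("/var/lib/example-telemetry/install-id")  # hypothetical location

        def get_install_id() -> str:
            """Return the install-time ID, creating it on first use."""
            if ID_FILE.exists():
                return ID_FILE.read_text().strip()
            new_id = uuid.uuid4().hex  # random; nothing to brute-force back out
            ID_FILE.parent.mkdir(parents=True, exist_ok=True)
            ID_FILE.write_text(new_id)
            return new_id

        def build_report(stats: dict) -> dict:
            """Attach only the opaque ID to an otherwise statistical payload."""
            return {"install_id": get_install_id(), **stats}

    The hash-of-stable-machine-info variant would work similarly; the advantage of a purely random ID is that there is nothing behind it to brute-force back out in the first place.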