How is Samba supposed to be a replacement for the entirety of Windows Server???
Debian-based distros (and probably most others as well) actually have a package called “intel-microcode” which gets updated fairly regularly.
I search for stuff in qBittorrent and download it directly onto my home server using the web UI. I’ve got most of my family’s devices set up to be able to access it either via an NFS or SMB mount, and then it’s just a simple matter of opening the corresponding video in VLC.
My new laptop doesn’t support S3 sleep; it can drain the battery from 100% to 0% in less than 16 hours while supposedly “sleeping”.
GeForce GT 610.
It was the cheapest GPU available at the time; imagine my disappointment when I tried to run Minecraft with shaders and barely got more than a slideshow.
Traditional graphics code works by having the CPU generate a sequence of commands which are packed together and sent to the GPU to run. This extension lets you write code which runs on the GPU to generate commands, and then execute those same commands on the GPU without involving the CPU at all.
This is a super powerful feature which makes it possible to do things which simply weren’t feasible in the traditional model. Vulkan improved on OpenGL by allowing people to build command buffers on multiple threads, and also re-use existing command buffers, but GPU pipelines are getting so wide that scenes containing many objects with different render settings are bottlenecked by the rate at which the CPU can prepare commands, not by GPU throughput. Letting the GPU generate its own commands means you can leverage the GPU’s massive parallelism for the entire render process, and can also make render state changes much cheaper.
(For anyone familiar, this is basically a more fleshed out version of NVIDIA’s proprietary NV_command_list extension for OpenGL, except that it’s in Vulkan and standardized across all GPU drivers)
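To make the contrast concrete, here’s a rough C sketch of the traditional CPU-driven path described above; all the handles are assumed to already exist, and render pass / pipeline setup is omitted. The CPU walks the scene and records every bind and draw itself; device-generated commands essentially let the GPU produce the equivalent of this loop’s output on its own.

```c
#include <stdint.h>
#include <vulkan/vulkan.h>

/* Sketch of the traditional model: the CPU loops over the scene and records
 * one command after another into a command buffer, which is then submitted
 * to the GPU. All handles (cmdBuf, pipelines, draw parameters) are assumed
 * to have been created elsewhere; render pass begin/end is omitted. */
void record_scene_cpu_side(VkCommandBuffer cmdBuf,
                           VkPipeline *pipelines,      /* one per material/render state */
                           uint32_t objectCount,
                           const uint32_t *vertexCounts)
{
    VkCommandBufferBeginInfo beginInfo = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
    };
    vkBeginCommandBuffer(cmdBuf, &beginInfo);

    for (uint32_t i = 0; i < objectCount; i++) {
        /* Every state change and draw is a CPU-side call; with enough objects
         * this loop, not the GPU, becomes the bottleneck. */
        vkCmdBindPipeline(cmdBuf, VK_PIPELINE_BIND_POINT_GRAPHICS, pipelines[i]);
        vkCmdDraw(cmdBuf, vertexCounts[i], 1, 0, 0);
    }

    vkEndCommandBuffer(cmdBuf);
}
```

With thousands of objects, that loop (and the driver state tracking behind each call) is exactly the CPU-side cost the extension is trying to remove.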
JetBrains IDEs.
Not the same person, but in my case I’m 182cm and my waist is 76cm. If I were 40cm shorter I’d actually (barely) be in the green area!
Conjunction Junction (What’s your Function)
“Better” in the sense that it actually has the ability to check for corruption at all, as all metadata and data are checksummed.
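A toy illustration of the principle in C: the filesystem records a checksum for every block when it’s written and re-computes it on read. CRC-32 here is just a stand-in for whatever hash the real filesystem uses, and the “disk block” is simulated in memory.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Bitwise CRC-32 (IEEE polynomial) -- slow but short; only here to show the idea. */
static uint32_t crc32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 1) ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
    }
    return ~crc;
}

int main(void)
{
    uint8_t block[4096];
    memset(block, 0xAB, sizeof block);             /* pretend this is a block as it was written */
    uint32_t stored = crc32(block, sizeof block);  /* checksum recorded in metadata at write time */

    block[100] ^= 0x01;                            /* one silently flipped bit "on disk" */
    uint32_t actual = crc32(block, sizeof block);  /* re-computed when the block is read back */

    puts(stored == actual ? "block OK" : "checksum mismatch: corruption detected");
    return 0;
}
```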
25% of millions of people is still many people, they didn’t say “a majority of people”.
You’ve made me uncertain if I’ve somehow never noticed this before, so I gave it a shot. I’ve been dd-ing /dev/random onto one of those drives for the last 20 minutes and the transfer rate has only dropped by about 4MB/s since I started, which is about the kind of slowdown I would expect as the drive head gets closer to the center of the platter.
EDIT: I’ve now been doing 1.2GB/s onto an 8-drive RAID0 (8x 600GB 15k SAS Seagates) for over 10 minutes with no noticeable slowdown. That comes out to 150MB/s per drive, and these drives are from 2014 or 2015. If you’re only getting 60MB/s on a modern non-SMR HDD, especially something as dense as an 18TB drive, you’ve either configured something wrong or your hardware is broken.
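If anyone wants to reproduce this kind of test without dd, here’s a minimal C sketch of a sequential-write benchmark. The target path is a placeholder (pointing it at a real block device will overwrite it), the block size and report interval are arbitrary, and the periodic fdatasync is there so you measure the drive rather than the page cache; stop it with Ctrl-C.

```c
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    const char *target = (argc > 1) ? argv[1] : "/tmp/write-test.bin"; /* placeholder path */
    const size_t blockSize = 1 << 20;            /* 1 MiB blocks, like dd bs=1M */
    char *buf = malloc(blockSize);
    if (!buf) return 1;
    memset(buf, 0x5A, blockSize);                /* any pattern; the dd test above used /dev/random */

    int fd = open(target, O_WRONLY | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    struct timespec start;
    clock_gettime(CLOCK_MONOTONIC, &start);
    size_t written = 0, lastReport = 0;

    for (;;) {                                   /* runs until the target is full or you hit Ctrl-C */
        ssize_t n = write(fd, buf, blockSize);
        if (n <= 0) { perror("write"); break; }
        written += (size_t)n;

        if (written - lastReport >= (size_t)256 << 20) {   /* report every 256 MiB */
            fdatasync(fd);                       /* flush so we time the disk, not the page cache */
            struct timespec now;
            clock_gettime(CLOCK_MONOTONIC, &now);
            double elapsed = (now.tv_sec - start.tv_sec) + (now.tv_nsec - start.tv_nsec) / 1e9;
            printf("%.1f MB/s average after %zu MiB\n", written / 1e6 / elapsed, written >> 20);
            lastReport = written;
        }
    }

    close(fd);
    free(buf);
    return 0;
}
```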
This is for very long sustained writes, like 40TiB at a time. I can’t say I’ve ever noticed any slowdown, but I’ll keep a closer eye on it next time I do another huge copy. I’ve also never seen any kind of noticeable slowdown on my four 8TB SATA WD Golds, although they only get to about 150MB/s each.
EDIT: The effect would be obvious pretty fast at even moderate write speeds; I’ve never seen a drive with more than a GB of cache. My 16TB drives have 256MB, and the 8TB drives only 64MB of cache.
My 16TB Ultrastars get upwards of 180MB/s sustained read and write; these will presumably be faster than that, as the density is higher.
Not sure what you’re on about; I have some cheap 500GB USB 3 drives from like 2016 lying around, and even those can happily handle sustained writes over 130MB/s.
Okay, but the commenter said “my laptop with its integrated GPU”. Obviously, laptops with a dedicated AMD GPU would be affected by this change.
Wow look at mister long dong over here reaching all the way into the water
It specifically says the change only applies to dedicated GPUs, not integrated ones.
The longer I look at this the more uncertain I am whether or not this is AI.
The fingers look weirdly long, but all of the text is actually written and oriented correctly, while the shading across the surface of all the cards seems to change brightness randomly, …