BTRFS RAID5/6 is fine as long as you don’t run into a scenario where your machine crashes while there is still unwritten data in the cache. Also, write performance sucks and scrubbing takes an eternity.
Gallium Nine also tends to be buggy with 32-bit software in particular. All the 32-bit games I’ve tried have problems with it: they usually work fine for the first 30-60 minutes, but after that the framerate becomes unstable to the point where the game is unplayable. It happens consistently with Gallium Nine but not at all with DXVK.
Says who? Is this shit really a race now? lol.
I didn’t realize how much worse YouTube has gotten since I last used it in 2019. Still, through Invidious I’ve noticed that the comments on popular videos have gotten weird, especially in recent years. It also seems like YouTube deletes any comment that is even remotely negative, because all I ever see anymore is generic positive praise. Meanwhile there are content farms out there churning out videos for “kids” at a rapid pace that contain borderline sexual content. I wish more people would start using PeerTube, because I have a feeling that YouTube won’t be getting any better in the future.
Incognito mode in Chrome is little more than the illusion of being logged off.
Which shouldn’t be a surprise since Chrome itself is just another one of Google’s spyware products.
Back when the conflict started, for the first few weeks here (Central Europe) every TV program that featured Ukrainian soldiers showed them with brown scotch tape over their upper arms. After a while they no longer gave a shit, probably because they figured the Wolfsangel would be obscure enough for the commoners.
Westmere Xeon processors are still quite OK imo. I have an old enterprise machine with one; 12 threads at 2.6 GHz is still quite usable for many things, and I mostly use it to compile larger software. But personally I’d argue that Loongson is already far better than Intel/AMD, since Loongson is based on MIPS, a RISC architecture, while Intel/AMD still cling to their bloated and way too power-hungry CISC crap. Plus, today most performance comes from parallelism and cache size rather than core frequency, and Loongson already has 128- and 256-bit vector instructions in its ISA, which is pretty decent. Maybe they can figure out a 512-bit vector extension that doesn’t severely throttle the CPU before Intel can, lol.
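Since the point is that vector width matters more than clock speed these days, here’s a minimal hypothetical C sketch (my own example, nothing Loongson-specific) using GCC/Clang vector extensions; the same source compiles down to whatever vector ISA the target offers, e.g. SSE/AVX on x86 or LSX/LASX on LoongArch, given the right `-march` flags:

```c
/* Hypothetical sketch: portable SIMD via GCC/Clang vector extensions. */
#include <stddef.h>
#include <string.h>

typedef float v8f __attribute__((vector_size(32))); /* 256-bit vector, 8 floats */

/* dst[i] = a[i] + b[i], 8 lanes per iteration; n must be a multiple of 8. */
void vadd(float *dst, const float *a, const float *b, size_t n)
{
    for (size_t i = 0; i < n; i += 8) {
        v8f va, vb, vd;
        memcpy(&va, a + i, sizeof va);  /* memcpy keeps unaligned loads safe */
        memcpy(&vb, b + i, sizeof vb);
        vd = va + vb;                   /* one vector add per iteration */
        memcpy(dst + i, &vd, sizeof vd);
    }
}
```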
US Imperials about to seethe
I have the same experience. I wrote a simple program with SDL2 to test a software renderer. All it does is create a window, go into an event loop, and after each iteration stream a framebuffer to a texture that gets displayed in the window. In the default mode (X11) my frame timings fluctuated a lot, and for a while I tried to massage the code to stabilize them because I was convinced it was just my draw code. Then I eventually forced SDL2 to use Wayland, and not only did the draw time per frame go down by 2 ms, the fluctuations went away completely.
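For reference, a minimal sketch of that kind of test (names and sizes are my own; the original code isn’t shown here). Running it with `SDL_VIDEODRIVER=wayland` in the environment forces SDL2’s Wayland backend:

```c
/* Minimal SDL2 streaming-texture loop: draw into a CPU framebuffer,
 * upload it to a texture every frame, present it in a window. */
#include <SDL2/SDL.h>

#define W 640
#define H 480

int main(void)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("swrender", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, W, H, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_PRESENTVSYNC);
    SDL_Texture *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING, W, H);
    static Uint32 fb[W * H];            /* the software framebuffer */

    for (int running = 1; running;) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = 0;

        /* ... software renderer draws into fb[] here ... */

        SDL_UpdateTexture(tex, NULL, fb, W * sizeof(Uint32));
        SDL_RenderClear(ren);
        SDL_RenderCopy(ren, tex, NULL, NULL);
        SDL_RenderPresent(ren);         /* timing this shows the fluctuations */
    }
    SDL_DestroyTexture(tex);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```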
It’s depressing to see so many AI-powered FOSS projects that are basically just a chatbot/autocomplete or something that can spit out some images, when there are so many cool things you could use neural nets for. For instance, as a FLOSS enthusiast, I’d love a tool that helps with reverse engineering proprietary binaries, specifically firmware and driver blobs; that could permanently change computing for the better. But everyone in the West seems more concerned with how they can use neural nets to reduce production costs and increase profits.
Got a link to the article?
Nice. Go home imperials
ComfyUI seems like the most promising, but it also uses ROCm/CUDA, which don’t officially support any of my current GPUs (models load successfully, but it fails midway through computing). Why can’t everyone just use compute shaders, lol.
I’m pretty sure I tried that one, but it kept running out of VRAM. Also, it utilizes proprietary AMD/NVidia software stacks, which are a pain to set up. GPT4ALL is a lot better in that regard; they just use Vulkan compute shaders to run the models.
On the topic of GPT4ALL, I’m curious: is there an equivalent of that, but for txt2img/img2img models? All the FOSS txt2img stuff I’ve tried so far is either buggy (some of the projects don’t even compile), requires a stupid amount of third-party dependencies, is made with NVidia hardware in mind while everyone else is second class, or requires unspeakable amounts of VRAM.
Kepler cards work “OK” with nouveau. What sucks is that reclocking has to be done manually, video decoding/encoding requires firmware blobs, and OpenGL support tends to be meh. Overall it’s an unstable experience. I have a stack of Kepler-based cards that would still be usable if Linux/Mesa had a decent driver.
XMPP is cool, but so many things you’d expect to be standard are extensions that both the server and all the clients need to have installed and enabled. Some XMPP clients don’t support all extensions, and some extensions also require third-party software and extra setup. Matrix just works.
That being said, signing up on matrix.org is cringe. Absolutely host your own homeserver.
I found the best strategy for Reddit right now (besides not using it) is to go for tiny, less obvious subs where the average post has at most a double-digit upvote count and a handful of comments. Anything bigger gets dangerous, since it may get ranked higher in r/all.
Or the buggy bloom effect in Cities: Skylines, Stellaris, and Surviving Mars that would cause flicker and a weird black screen. Pretty sure they never bothered to fix that.
That’s literally what I’m saying: it’s fine as long as there wasn’t any unwritten data in the cache when the machine crashed or suddenly lost power. RAID controllers have a battery-backed write cache for exactly this reason; traditional RAID5/6 has the same write-hole issue.