Uncut_Lemon@lemmy.world to Selfhosted@lemmy.world • Experiences with zfs deduplication? • 18 days ago
You'd be better off enabling compression on the dataset.
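A minimal sketch of what that looks like, assuming a pool named `tank` with a dataset `tank/media` (both names are placeholders):

```sh
# Enable inline compression on the dataset; lz4 is cheap and a safe default
# (newer OpenZFS also offers zstd).
zfs set compression=lz4 tank/media

# Verify the property and check the achieved ratio.
# Note: compression only applies to blocks written after it is enabled;
# existing data stays as-is until rewritten.
zfs get compression,compressratio tank/media
```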
Dedupe, even with the recent improvements, has huge overheads, and its performance generally degrades as the dataset grows, because ZFS has to keep the deduplication table (DDT) in RAM to map incoming blocks back to the already-stored copies. Apparently the latest OpenZFS release reduces the speed losses on larger datasets, but it's still subpar compared to simply compressing the data.
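If you want to gauge whether dedup would pay for its RAM cost before committing, `zdb` can simulate it against an existing pool. A sketch, again assuming a pool named `tank` (the simulation is read-only but can take a while on large pools):

```sh
# Simulate dedup on the pool: prints a DDT histogram and the dedup
# ratio you would get, without actually enabling the feature.
zdb -S tank

# On a pool that already has dedup enabled, show DDT statistics
# (entry counts and table sizes) as part of pool status.
zpool status -D tank
```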
Video files are already heavily compressed, so filesystem-level compression won't gain you much there; you'd be better off transcoding them to a more efficient codec, like x265 (HEVC) or AV1, to save space on video files.
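For the transcoding route, a hedged ffmpeg sketch, assuming your ffmpeg build includes libx265 and SVT-AV1 and with placeholder filenames. CRF values are just common starting points; re-encoding is lossy, so test on a short clip first:

```sh
# Re-encode video to HEVC (x265) at CRF 28, copying the audio stream untouched.
ffmpeg -i input.mp4 -c:v libx265 -crf 28 -preset medium -c:a copy output.mkv

# AV1 alternative via SVT-AV1: slower to encode, but more space-efficient.
ffmpeg -i input.mp4 -c:v libsvtav1 -crf 35 -preset 6 -c:a copy output_av1.mkv
```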
My feelings exactly. Somehow Linux has managed to achieve so much with C: running on all the major cloud providers, running mission-critical apps. Shit, we have Linux and BSD in space, running long-term missions successfully.
The Rust cult constantly seems to demand integration with the Linux kernel, and to be toxic about it, while actually contributing very little toward the interoperability themselves, demanding the kernel devs sort it out, or else…
I'm not a dev; it's just how a lot of this drama reads.