darktable 5.0.0 released
We’re proud to announce the new feature release of darktable, 5.0.0! The github release is here: https://github.com/darktable-org/darktable/releases/tag/release-5.0.0
Pascal Obry (darktable)
Forget Chrome—Google Starts Tracking All Your Devices In 8 Weeks
Digital fingerprinting is suddenly back and it will be everywhere—here's what you need to know.
Zak Doffman (Forbes)
Bird flu update: Maps show states most affected
The bird flu outbreak has spread to all 50 states, infecting dairy cattle, poultry farms and 61 humans across the country.
USA TODAY
Is using an HDD with an SSD as cache on Linux a good idea?
I currently have a 1 TiB NVMe drive that has been hovering at 100 GiB left for the past couple months. I've kept it down by deleting a game every couple weeks, but I would like to play something sometime, and I'm running out of games to delete if I need more space.
That's why I've been thinking about upgrading to a 2 TiB drive, but I just saw an interesting forum thread about LVM cache. The promise of having the storage capacity of an HDD with (usually) the speed of an SSD seems very appealing, but is it actually as good as it seems to be?
And if it is possible, which software should be used? LVM cache seems like a decent option, but I've seen people say it's slow. bcache is also sometimes mentioned, but apparently that one can be unreliable at times.
Beyond that, what method should be used? The Arch Wiki page for bcache mentions several options. Some only seem to cache writes, while some aim to keep the HDD idle as long as possible.
Also, does anyone run a setup like this themselves?
...depends what your use pattern is, but I doubt you'd enjoy it.
The problem is the cached data will be fast, but the uncached will, well, be on a hard drive.
If you have enough cached space to keep your OS and your used data on it, it's great, but if you have enough disk space to keep your OS and used data on it, why are you doing this in the first place?
If you don't have enough cache drive to keep your commonly used data on it, then it's going to absolutely perform worse than just buying another SSD.
So I guess if this is 'I keep my whole steam library installed, but only play 3 games at a time' kinda usecase, it'll probably work fine.
For everything else, eh, I probably wouldn't.
Edit: a good usecase for this is more the 'I have 800TB of data, but 99% of it is historical and the daily working set of it is just a couple hundred gigs' on a NAS type thing.
I used to run an HDD with an SSD cache. It's deffo not as fast as a normal SSD. NVMe storage is also very cheap; you can get a 2 TB NVMe for the same price as SATA.
In all honesty, I'd just keep things simple and go for an SSD.
Apple tried it a decade ago. It was called the Fusion Drive. It performed about as well as you’d expect. macOS saw the combined storage, but the hardware and OS managed the pair as a single unit.
If there’s a good tiered storage daemon on your OS of choice, go for it!
Currently have two 1 TB NVMe drives over around 6 TB of HDDs; works really nicely to keep a personal Steam cache on the HDDs in case I pick up an old game with friends, or want to play a large game but only use part of it (i.e. CoD zombies).
It's also super helpful for shared filesystems (Syncthing or NFS), as it's able to support peripheral computers a lot more dynamically than I'd ever care to personally configure. (If that's unclear: I use it for a Jellyfin server, a Crafty instance, some coding projects - things that see heavy use in bursts, but tend to have an attention lifespan.)
Using bcachefs with backups myself, and after a couple months my biggest worry is the kernel drama more than the fs itself
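For anyone curious, the tiering in bcachefs is set up at format time; a rough sketch, with placeholder device names and labels you'd adapt to your own drives:

```
# one filesystem spanning an SSD and an HDD, with the SSD acting as
# foreground/promote (write + read cache) and the HDD as background storage
bcachefs format \
  --label=ssd.ssd1 /dev/nvme0n1p4 \
  --label=hdd.hdd1 /dev/sdb \
  --foreground_target=ssd \
  --promote_target=ssd \
  --background_target=hdd
```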
Back when SSDs were expensive and tiny they used to sell hybrid drives which were a normal sized HDD with a few gigs of SSD cache built in. Very similar to your proposal. When I upgraded from a HDD to a hybrid it was like getting a new computer, almost as good as a real SSD would have been.
I say go for it.
If it's all Steam games then you could just move games around as needed, no need for a fancy automatic solution.
I haven't, I'll try that
EDIT: I've tried it and it had little effect (< 1 GiB)
L2ARC is not a read cache in the conventional sense, but something closer to swap for disks only. It is only effective if your ARC hit rate is really low from memory constraints, although I’m not sure how things stack up now with persistent L2ARC. ZFS does have special allocation devices, though, where metadata and optionally small blocks of data (which HDDs struggle with) can go, but you can lose data if these devices fail. There’s also the SLOG, where sync writes can go. It’s often useful to use something like optane drives for it.
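If you do want to experiment, each of these is added with a single command; a rough sketch, assuming a pool named tank and placeholder NVMe partitions:

```
# L2ARC read cache (safe to lose; it only holds cached reads)
zpool add tank cache /dev/nvme0n1p1

# SLOG for synchronous writes (small, low-latency device is enough)
zpool add tank log /dev/nvme0n1p2

# special allocation class for metadata and small blocks;
# mirror it, because losing it means losing the pool
zpool add tank special mirror /dev/nvme0n1p3 /dev/nvme1n1p3

# optionally route small data blocks to the special vdev
zfs set special_small_blocks=16K tank
```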
Personally, I’d just keep separate drives. A lot of caching methods are afterthoughts (bcache is not really maintained, as Kent is now working on bcachefs) or, like ZFS, are really complex and are not true readback/writeback caches. In particular, LVM cache can, depending on its configuration, lead to data loss if a cache device is lost, and LVM itself can incur some overhead.
Flash is cheap. A 2TB NVMe drive is now roughly the cost of 2 AAA games (which is sad, really). OP should just buy a new drive.
L2ARC only does metadata out of the box. You have to tell it to do data & metadata. Plus for everything in L2ARC there has to be a memory page for it. So for that reason it’s better to max out your system memory before doing L2ARC.
It’s also not a cache in the way that LVMCACHE and BCACHE are.
At least that’s my understanding from having used it on storage servers and reading the documentation.
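If you do go that route, the knob (as far as I understand it) is the per-dataset secondarycache property, which controls what the L2ARC is allowed to hold:

```
# check what is currently allowed into L2ARC for this pool/dataset
zfs get secondarycache tank

# cache both data and metadata (other values: "metadata", "none")
zfs set secondarycache=all tank
```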
You can do it but I wouldn't recommend it for your use-case.
Caching is nice but only if the data that you need is actually cached. In the real world, this is unfortunately not always the case:
- Data that you haven't used for a while may be evicted. If you need something infrequently, it'll be extremely slow.
- The cache layer doesn't know what is actually important to be cached and cannot make smart decisions; all it sees is IO operations on blocks. Therefore, not all data that is important to cache is actually cached.
Block-level caching solutions may only store some data in the cache where they (with their extremely limited view) think it's most beneficial. Bcache for instance skips the cache entirely if writing the data to the cache would be slower than the assumed speed of the backing storage and only caches IO operations below a certain size.
Having data that must be fast always stored on fast storage is the best.
Manually separating data that needs to be fast from data that doesn't is almost always better than relying on dumb caching that cannot know what data is the most beneficial to put or keep in the cache.
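To illustrate, a minimal bcache setup looks roughly like this (device names are placeholders; sequential_cutoff is the tunable behind the bypass behaviour described above, and the default cache mode is writethrough):

```
# format the HDD as the backing device and an SSD partition as the cache
make-bcache -B /dev/sdb
make-bcache -C /dev/nvme0n1p4

# attach the cache set to the backing device
# (get the cset UUID from: bcache-super-show /dev/nvme0n1p4)
echo <cache-set-uuid> > /sys/block/bcache0/bcache/attach

# sequential IO larger than this bypasses the cache entirely (0 = cache everything)
echo 4M > /sys/block/bcache0/bcache/sequential_cutoff

# optional: writeback instead of the default writethrough (faster, but riskier
# if the cache device dies)
echo writeback > /sys/block/bcache0/bcache/cache_mode
```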
This brings us to the question: What are those 900GiB you store on your 1TiB drive?
That would be quite a lot if you only used the machine for regular desktop purposes, so clearly you're storing something else too.
You should look at that data and see what of it actually needs fast access speeds. If you store multimedia files (video, music, pictures etc.), those would be good candidates to instead store on a slower, more cost efficient storage medium.
You mentioned games which can be quite large these days. If you keep currently unplayed games around because you might play them again at some point in the future and don't want to sit through a large download when that point comes, you could also simply create a new games library on the secondary drive and move currently not played but "cached" games into that library. If you need it accessible it's right there immediately (albeit with slower loading times) and you can simply move the game back should you actively play it again.
You could even employ a hybrid approach where you carve out a small portion of your (then much emptier) fast storage to use for caching the slow storage. Just a few dozen GiB of SSD cache can make a huge difference in general HDD usability (e.g. browsing it) and 100-200G could accelerate a good bit of actual data too.
According to firelight I have 457 GiB in my home directory, 85 GiB of that is games, but I also have several virtual machines which take up about 100 GiB. The / folder contains 38 GiB, most of which is due to the nix store (15 GiB) and system libraries (/usr is 22.5 GiB). I made a post about trying to figure out what was taking up storage 9 months ago. It's probably time to try pruning docker again.
EDIT: ncdu says I've stored 129.1 TiB lol
EDIT 2: docker and podman are using about 100 GiB of images.
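For reference, the prune commands for both engines (careful: -a removes any image not referenced by an existing container, and --volumes also removes anonymous volumes):

```
docker system prune -a --volumes
podman system prune -a --volumes
```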
I also have several virtual machines which take up about 100 GiB.
This would be the first thing I'd look into getting rid of.
Could these just be containers instead? What are they storing?
nix store (15 GiB)
How large is your (I assume home-manager) closure? If this is 2-3 generations worth, that sounds about right.
system libraries (/usr is 22.5 GiB).
That's extremely large. Like, 2x of what you'd expect a typical system to have.
You should have a look at what's using all that space using your system package manager.
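A quick, distro-agnostic first pass is just to see which directories under /usr dominate before digging into the package manager:

```
# largest subtrees two levels below /usr, staying on this filesystem
du -xh --max-depth=2 /usr 2>/dev/null | sort -h | tail -n 20
```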
EDIT: ncdu says I've stored 129.1 TiB lol
If you're on btrfs and have a non-trivial subvolume setup, you can't just let ncdu loose on the root subvolume. You need to take a more principled approach.
For assessing your actual working size, you need to ignore snapshots for instance as those are mostly the same extents as your "working set".
You need to keep in mind that snapshots do themselves take up space too though, depending on how much you've deleted or written since taking the snapshot.
btdu is a great tool to analyse space usage of a non-trivial btrfs setup in a probabilistic fashion. It's not available in many distros but you have Nix and we have it of course ;)
Snapshots are the #1 most likely cause for your space usage woes. Any space usage that you cannot explain using your working set is probably caused by them.
Also: Are you using transparent compression? IME it can reduce space usage of data that is similar to typical Nix store contents by about half.
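For example (mount point and subvolume paths are placeholders; btdu and compsize need to be installed separately):

```
# probabilistic space breakdown across all subvolumes and snapshots
# (mount the top-level subvolume somewhere first, e.g. with -o subvolid=5)
btdu /mnt/btrfs-root

# how much of the existing data is actually compressed
compsize /

# enable zstd compression for new writes (or put compress=zstd:3 in fstab)
mount -o remount,compress=zstd:3 /

# optionally recompress existing files in place;
# note: defragmenting breaks sharing with snapshots, so usage can go UP first
btrfs filesystem defragment -r -czstd /nix
```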
What I personally have is an NVME SSD for the games that need the maximum performance and/or I play a lot, and a slower SATA SSD for the other games where it doesn't really matter.
Also, depending on the games you are playing, many expect to be able to stream assets and other data at SSD speeds, so your experience might be really bad.
I used to do this all the time! So in terms of speed bcache is the fastest, but it’s not as well supported as lvm cache. IMHO lvm cache is plenty fast enough for most uses.
Is it going to be as fast as an NVMe SSD? Nope. But it should be about as fast as a SATA SSD, if not a little slower, depending on how it's getting the data. If you're willing to take that trade-off, it's worth it. Though anything already cached is going to be accessed at NVMe speeds.
So it’s totally worth it if you need bigger storage but can’t afford the SSD. I would go bigger in your HDD though, if you can. Because unless you’re accessing more than the capacity of your SSD frequently; the caching will work extremely well for both reads and writes. So your steam games will feel like they’re on a SSD, most of the time, and everything else you do will “feel” snappy too.
So, having a cache drive of around 10% of the main drive's size seems like a good size-to-cost compromise. Having a cache that's 50% the size of the backing storage feels like a waste to me.
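For what it's worth, a rough lvmcache recipe with a modern lvm2 looks like this (VG name, devices and sizes are placeholders; writethrough keeps the HDD copy authoritative so a dying cache SSD doesn't take data with it, while writeback is faster but riskier):

```
# one volume group spanning the HDD and an SSD partition
vgcreate vg0 /dev/sdb /dev/nvme0n1p5

# big data LV on the HDD, ~10%-sized cache LV on the SSD
lvcreate -n data  -L 1.8T vg0 /dev/sdb
lvcreate -n cache -L 180G vg0 /dev/nvme0n1p5

# bolt the cache onto the data LV
lvconvert --type cache --cachevol cache --cachemode writethrough vg0/data
```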
I always somehow miss when it stops working and by the time I go to YouTube again it's already working again.
Blessings to Gorhill
Yeah I don't really care about value when it comes to giving money to the guys who work with the NSA and CIA to find ways to more thoroughly spy on every user 24/7, and turned every search into "You asked for x, here's a dozen pages of what the State Department thinks you should have searched for instead"
Not to mention their genocide profiteering: mintpressnews.com/project-nimb…
mintpressnews.com/national-sec…
I'd like to think they're advising on how to keep ISIS propaganda, gore / executions, child endangerment, etc, from popping up on clearnet results....
...but * sigh * , former (not ex, let's be honest lol) spooks...so, why wouldn't it include some kind of pro-employer propaganda plan, right?
I exaggerated for effect, in the way that 99% sure might as well be a fact in this case:
I have never given them to YouTube, and they have no financial incentive to acquire them AFAIK - holding that kind of PI is a liability so if anything they wouldn't want it without having a need for it. YouTube can't even know what countries I live in, my digital identity from the POV of their servers is too fluid and non-unique for my viewing habits to meaningfully correlate; I blend in with many other people also trying to stay hidden from them.
As for other Alphabet companies, like those engaged in surveillance capitalism who want to scoop up all of the datas, it's theoretically possible they've illegally acquired them from third parties and found a use for it, but there's just no feasible way they could associate that with most of my online activities, say, this account I'm using. The only people who have a chance at that are certain state intelligence agencies who are eavesdropping the wires, and they have much bigger problems they're paid to worry about. Hell, unless things have gotten better for them since Snowden, even they might struggle - most of their super cool hacker shit is only really useful if someone's worth active targeting.
Data gathering/brokering and payment information security are not really connected. PCI compliance standards are well standardized and fairly strict.
I would trust Google to handle payment information securely over any ‘media’ company.
If personal data was regulated at even the fraction of what payment data goes through we would all be better off.
I used to pay for it, for the same reasons. They stopped taking my money, I don't know why, and I noticed zero change in the quality of the service.
I'm paying for other google services, so I don't know why youtube specifically stopped. Oh well.
Yup.
Family plan is $22 for ad-free YT and a music subscription for 5 or 6 people.
I get that Google bad and all that, but it's a good deal.
Google bad
$22 is a bad deal for me. I'd rather donate $20 to the groups helping us get around it, and spend the other $2 on jawbreakers!
NewPipe has worked perfectly for me for years now. I even use the SponsorBlock fork to skip the sponsored segments from video creators.
What these morons don't count on is that everyone actually hates the technology deep down. We don't want it! But it gives us a dopamine hit. And when they stack on subscription prices and lock up content and shove ads down our throats... well, the dopamine stops hitting and we just get pissed. So we leave.
Yes, video quality has dropped, video suggestion algorithms have become a weird ouroboros/echo chamber even if you have dozens of subscriptions, and the YouTube Shorts reel refuses to be trained (no matter what I do, if I dislike every video I don't want to see and like all the ones I do want to see and log off if it suggests too many bad videos in a row, it still feeds me an endless loop of unwanted brain rot after 5 or 6 scrolls). I hate YouTube.
At the same time, they've found a good way around the ad block situation, which is to promote ads as thumbnails on your "for you" video main page. I don't know why they didn't just do that in the first place, because honestly I don't mind that. It's when they constantly interrupt my videos every freaking minute and a half that I start to get pissed.
Could be, I don't know. I doubt that video creators would go through all of these old videos just to update them to a slightly higher bitrate; the other possibility is that YouTube kept the original uploads or higher bitrate variants without previously showing them (and only showed them now), but that seems like a huge waste of storage, so it seems unlikely to me. Again, we're talking about old uploads (2-3 or up to 10 years ago), not new ones.
The one thing I've seen that makes sense is updating old videos that were previously available at up to 480p and bringing them up to 720p or 1080p (with the idea of keeping the original published video with the views, comments and so on instead of uploading a new one).
The genocide grinds on: Peoples Dispatch
After 14 months of Israel's genocide on Gaza, conditions for the millions of displaced remain perilous and Israel's airstrikes are unrelenting.
Vijay Prashad (Peoples Dispatch)
Survey: Almost two-thirds of Malaysians hold favourable views of China, Malay perception improves significantly
KUALA LUMPUR, Dec 19 — Nearly two-thirds of Malaysians hold favourable views of the People’s Republic of China and believe that Malaysia-China relations are progressing well, ...
Kenneth Tee (Malay Mail)
Why China Isn’t Scared of Trump: U.S.-Chinese Tensions May Rise, but His Isolationism Will Help Beijing
U.S.-Chinese tensions may rise, but Washington’s isolationism will help Beijing.
Yan Xuetong (Foreign Affairs Magazine)
China is in the enviable position of not having to pretend that Trump is good for anything. He is a bad choice for a president, so why would anyone be afraid?
Edit: well, ok Ukraine and by extension Europe should be afraid or at least concerned. I would shit myself if I were a US citizen that is not a millionaire or wealthier tho.
I would shit myself if I were a US citizen that is not a millionaire or wealthier tho.
Do not be alarmed. Democrats don't care about the working class, either.
It's why they would rather have 2 trump presidencies than 1 Bernie.
Godot 4.4 Gets Native Jolt Physics Support – GameFromScratch.com
With the release of Godot 4.4 Dev7 the Jolt physics engine is now available directly in Godot and will become the default physics engine.
Mike (GameFromScratch)
Xi Jinping urges Macau to diversify economy away from casinos
Chinese president calls for city to ‘focus on cultivating new industries’ as he attends inauguration of new leader.
Amy Hawkins (The Guardian)
I've tried to main it on a few occasions, most recently on 4.1. It's immensely powerful and I really think it surpasses Lightroom in its ability to create pleasing tones. I have it installed on my home and laptop photo editing setups and I do use it on occasion.
Unfortunately, even as an Adobe hater, I still use Lightroom CC 99% of the time. Why? Because of speed and cross-platform compatibility. CC is less powerful* but I can do all of my editing in 30 seconds per photo and I have roughly the same experience across Mac, Linux, and Android.
Darktable is slow to update, you have to be methodical, and there are so many ways to do the same thing. I know the devs are trying to make the best tool possible and I think they've built a gem. But I'm not invested enough to learn best practices for my photo editing software. I want a tool which gives me the happy path to the basics.
*AI masking, AI noise reduction, and AI object deletion are insanely useful. I feel bad every time I use them... but I do. Darktable doesn't have these.
Personally, I'm grateful this tool exists.
I have used Adobe Lightroom 5, the one you could get on a disc, like, when owning things was actually possible. Adobe has systematically pissed me off over the last decade. Lightroom was great: non-destructive edits, import into year folders with subdirectories sorted by month, quick copy-and-apply edits. Lr5 was great.
I'm just a hobbyist photographer; I'm not doing pro-level anything or charging anyone anything. I would love to use the student edition. I refuse to, though, because it requires uploading your photos to Adobe online where they can be used for AI training; it's not private. I take photos on a camera to NOT have them on the Internet.
To be honest, I'd be upset if a photographer used any AI or cloud storage for my personal photos. Sadly, it's so baked in that a photographer might not even know. Not everyone cares or is tech savvy (which is totally fine); it's not their fault the company is shady.
That was the first issue. Second, they won't support the version I have any longer; OK, that's how software/hardware works, but it's a subscription model now and that sucks. I upload 6 months of photos at a clip, I don't need a monthly sub. Because of that I'm tied to an old laptop that's on death's door to edit my pics.
Darktable provides everything I need that Lightroom did, sans a small bit of import magic to organize photos. It's a little tricky to use, but after about an hour I understood how to get things going. Anything has a learning curve.
With darktable I know my pics are mine, they are on my laptop, I won't be paying a subscription. That small amount of frustration is worth it to tell Adobe to piss off.