I got it for free with a CPU I bought. Played roughly 3 hours before I stopped. It was just too boring.
They exchanged contact info after the accident, and then that someone turned out to be extremely petty. That’s my guess.
Considering the entire 200+ years, the US is considerably less evil. Is there room for improvement? Yes.
Honestly not that stupid. I have seen SD cards break. And for certain applications, like professional photography, having a more physically reliable medium is a good thing.
But I think cameras with dual SD card slots for redundancy are more important.
Aspartame has about the same amount of calories as sugar (4 kcal per gram). But it’s much sweeter, so you need very little of it. So there is a very tiny amount of sweetener that does contain calories, but it rounds down to 0 on the label.
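For context, aspartame is roughly 200 times sweeter than sucrose, so the rounding works out something like this (a rough sketch; the 200x figure and the 10 g serving are approximate assumptions):

```python
# Rough numbers: aspartame is ~200x sweeter than sucrose,
# and both provide ~4 kcal per gram.
SWEETNESS_FACTOR = 200
KCAL_PER_GRAM = 4

sugar_grams = 10  # assumed sugar content of a serving
aspartame_grams = sugar_grams / SWEETNESS_FACTOR  # same sweetness from 0.05 g

sugar_kcal = sugar_grams * KCAL_PER_GRAM          # 40 kcal
aspartame_kcal = aspartame_grams * KCAL_PER_GRAM  # 0.2 kcal, rounds to 0

print(sugar_kcal, aspartame_kcal)
```

So the sweetener genuinely has calories; there are just too few per serving to show up on the label.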
There’s software that can do this for you as well. I pass both input and output over the network with no noticeable latency (via ethernet).
If you’re using Linux and PipeWire, this is supported out of the box:
pactl load-module module-native-protocol-tcp port=4656 listen=IPADDRESS
pactl load-module module-tunnel-sink server=tcp:IPADDRESS:4656
There are other methods on different OSes; this is just what I use.
Interesting, I’ll have to look at the source article.
But as far as I’m aware the total amount of nuclear power has been decreasing in recent years. This might change with China’s future plants.
I’ve also read about small modular reactor designs gaining traction, which would help alleviate the heavy costs of one off plants we currently design and build.
Not saying the source is wrong, just saying that’s what I used to form my opinion.
I think that’s too simplistic of a view. Part of the high cost of nuclear is the somewhat niche use. As with everything, economies of scale make things cheaper. Supporting one nuclear plant with specialized labor, parts, fuel, etc. is much more expensive than supporting 100 plants, per watt.
I can’t say more plants would drastically reduce costs. But it would definitely help.
“Sure, you can do everything it does with a phone”
No, you can’t do everything with a phone. A phone doesn’t have the same radios, GPIO for expandability, IR transceiver, etc. Not to mention the radios a phone does have don’t like it when you start forcing them to do fun things.
I’m not sure what definition of UBI you’re using, but not all forms of UBI need to cover the entirety of living expenses. UBI is just having income without strings attached. This very study is showing that even small amounts of money can help people get out of shitty situations.
Also, as someone who lives in Denver, it’s not that expensive. Sure, $1500+ is what you’ll pay around LoDo, but there are plenty of cheaper places.
I’m on lemmy.ml, looks to still be federated?
You don’t need to run a hot water line; a lot of models just use electricity to warm a small tank of water. This will work better than a hot water line, since with a line you’d have to wait until the cold water flushes out of it. Unless you have a recirculation pump for your hot water, I guess.
It’s confirmed steam deck compatible at launch, so it’ll work fine.
I wish more gaming benchmarks for CPUs included games that are CPU heavy. The only one in this list seems to be Total War, but I skimmed the video and I’m not sure if they even tested campaign turn times vs battle.
Show us Stellaris, Civilization, Oxygen Not Included, etc. Some of them aren’t that popular, but they would at least be a good indicator for future CPU bound games.
That’s a very good point, but a little misleading. A better number would be to add up the top tier cards from every generation, not just the past 2. Just because they’re old doesn’t mean they weren’t relatively inefficient for their generation.
If we kept the generations exactly the same but got rid of the top 1 or 2 cards, the technological advancement would happen just as fast. Because really, the top tier cards are about silicon lottery and putting in as much power as possible while keeping stable clocks. They aren’t different from an architecture perspective within the same generation. It’s about being able to sell the best silicon and more VRAM at a premium.
But as you said, it’s still a drop in the bucket compared to the overall market.
I understand the sentiment, but it seems like you’re drawing arbitrary lines in the sand for what is the “correct” amount of power for gaming. Why waste 50 watts of GPU (or more like 150 total system watts) on a game that something like a Steam Deck can run at 15 watts almost identically? 10 times less power for definitely not 10 times less fidelity. We could go all the way back to the original Game Boy at 0.7 watts; the fidelity drops, but so does the power. What is the “correct” wattage?
I agree that the top end GPUs are shit at efficiency and we could cut back. But I don’t agree that fidelity and realism should stop advancing. Some type of efficiency requirement would be nice, but every year games should get more advanced and every year GPUs should get better (and hopefully stay efficient).
If you like RPGs in general, I think it’s worth playing. No need to be a fan of D&D.
Exactly. I should have expanded further, but I was including Forgotten Realms as part of the D&D brand.
This isn’t new, at all. They’re just being more transparent about it. It feels shitty that transparency is met with outrage stemming from ignorance. Just buy from GOG.