

Using Traefik outside of k8s is for masochists. Especially after configv2. Caddy is by far the easiest reverse proxy to configure and has the sanest defaults.
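To illustrate the point, a complete Caddyfile for a reverse proxy with automatic HTTPS can be about this short (the domain and upstream port here are just placeholders, not anything from a real setup):

    # Caddyfile -- reverse proxy with automatic HTTPS; domain and port are placeholders
    example.com {
        reverse_proxy localhost:8080
    }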
I wasn’t looking for technical support. You can do everything correctly and still get your mails randomly marked as spam or not delivered at all. This has happened to us, some of our customers, multiple smaller email providers as well as several municipalities (imagine blackholing government emails, what a grand idea). They don’t send sensible return headers, they might not even return your undelivered mail at all, they won’t react to any inquiries to their postmaster contact (or anywhere else really), and they will sometimes blacklist entire IP blocks. The only way to sidestep any issues with them is to pay a few thousand bucks to enter their cool kids club, the Certified Senders Alliance, which is what the big marketing firms use to deliver mass amounts of unwanted ads unhindered through their networks.
It is cheap, but the performance leaves much to be desired and their technical support is piss poor.
I’ve had the opposite experience with their cloud services in a professional context. My biggest gripe is with United Internet, the monopolistic company that owns IONOS, 1&1 (an ISP) as well as the ad-ridden, flaming piles of garbage that are GMX and WEB.DE, two of the most popular email service providers in Germany and a constant source of pain for anyone operating an email server. They will ignore common industry standards and best practices, silently block your mail server for absolutely no reason, not respond to inquiries and just generally make the internet a slightly worse place for small to medium-sized businesses and selfhosters.
It’s an alternative, but IONOS honestly fucking sucks as well, so I’m feeling pretty ambivalent about this.
There’s a link for alternate packings on that page, where you can see older versions, some with more aesthetically pleasing patterns of minimal tilted squares or symmetry. All of them use a larger value for s though, and it’s hard to tell where your version would fit in.
Probably why the comment you replied to didn’t say that at all.
Of course. I’m literally shaking with pure rage.
You have no idea how the people complaining about video game prices spend their money. You just disagree with them and make shit up apparently.
1992 was a very different time with very different market conditions and consumer behaviour for video games. Games used to have a much greater perceived entertainment value, despite their relatively small development budgets compared with today. They were also entirely physical media and renting was still a very common way to play them. From what I remember, it wasn’t the most financially accessible hobby either. Most of my friends growing up didn’t have permanent access to their own gaming console and not everyone that did had all the latest games. Nowadays, the gaming market is completely saturated with high quality titles, most of which are fairly cheap as well if you don’t buy them on release.
In any case: Super Mario Bros 3 came out in Japan in 1988 and released in 1990 and 1991 in the US and Europe respectively. It also didn’t cost $59, and your inflation calculation seems off…
It is true and has been my experience for the last decade or so. Unfortunately, OP is trying to use a GPU from 2015 that’s still based on GCN 1.0 with the newer amdgpu driver stack, which is not officially supported. Effectively, OP is getting a taste of what it was like before AMD started pouring resources into their open source GPU drivers.
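For what it’s worth, the usual workaround is to force the card onto amdgpu via kernel parameters. Roughly, assuming a GCN 1.0 (Southern Islands) card and a GRUB-based distro:

    # /etc/default/grub -- tell radeon to ignore Southern Islands cards and let amdgpu claim them
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.si_support=0 amdgpu.si_support=1"
    # then regenerate the grub config (e.g. sudo update-grub on Debian/Ubuntu) and reboot

No guarantees of course, it’s unsupported for a reason.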
Imagine a tool that gives you a language in which you can describe the hardware resources you want from a cloud provider. Say you want multiple different classes of servers with different sets of firewall rules. Something like Terraform allows you to put that into a text-based form, make changes to it, re-run the tool and expect resources to be created, changed and destroyed to match what you wrote down.
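A rough sketch of what that text-based form looks like, using the AWS provider purely as an example (the AMI ID is a placeholder):

    # main.tf -- hypothetical example; the AMI ID is a placeholder
    resource "aws_security_group" "web" {
      name = "web"

      # allow inbound HTTPS from anywhere
      ingress {
        from_port   = 443
        to_port     = 443
        protocol    = "tcp"
        cidr_blocks = ["0.0.0.0/0"]
      }
    }

    # three identical servers attached to the security group above
    resource "aws_instance" "web" {
      count                  = 3
      ami                    = "ami-0123456789abcdef0"
      instance_type          = "t3.small"
      vpc_security_group_ids = [aws_security_group.web.id]
    }

terraform plan shows you the diff against what currently exists, terraform apply makes it real, and deleting a resource block and applying again tears the corresponding servers down.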
We need Mozilla corp to be better and there is currently no good way of forcing that to happen.
The Phoronix comment section has always been kinda shit. Maybe one in every thousand posts will contain anything of value (in most cases a comment by a developer telling the peanut gallery why they’re wrong).
The guidelines for Windows developers kinda suck tbh. Maybe it’s better these days, but plenty of weird legacy software behaviour can be blamed on MSDN.
Sorry, no idea. I think I’ve only ever watched other people play multiplayer supcom and the few tutorials I watched were for Forged Alliance Forever. This is the kind of stuff I was watching a decade ago. Check it out if you’re into old gameplay videos with a crusty mic track. :D
If your mass storage is full, any excess is wasted, so you should always try to make sure that it is being spent on something useful. SC1/FA incentivizes a constant balancing of your economy.
You can reclaim mass from dead units (even civilians), buildings, rocks, trees etc. Trees also give a bunch of energy, which can be useful very early on. A failed attack will quite often turn out to be a mass donation that gets recycled into an army for retaliation. All of this might be balanced differently between the different versions of the game, so I can’t be sure that it applies all that well to base SC1 vs. FA or even FAF.
Don’t build all your energy reactors in a big cluster if you can help it. One well-placed attack will blow the whole thing up in a chain reaction. On the other hand, sabotaging your opponent’s power grid is often a solid strategy.
If anyone is interested in FAF and wants to take a look at some different levels of gameplay, check out GyleCast on YouTube.
As always it depends on the price you pay for your upgrade. Why the 3080 specifically? What kind of performance uplift are you expecting? A 3080 is probably gonna fall asleep at 1080p.
A CPU bottleneck won’t give you crashes by itself, unless there’s something else wrong with your system. It just means that as you reach higher framerates (whatever the cause, better GPU, lower settings or resolution etc.) your CPU will be the limiting factor. For most games that might still be comfortably within your expectations, even if your GPU isn’t being fully utilized. The main outliers are mostly esports games on lower settings and resolutions. If you play graphically demanding games on high settings or want to upgrade to a higher resolution at some point, then knock yourself out I guess. I find it kinda hard to guesstimate how “efficient” a GPU upgrade will be, since it depends on so many factors. If you can, then you may as well just give it a try, see how you like it and send it back if you’re disappointed (assuming that’s easy to do where you live).