• 18 Posts
  • 412 Comments
Joined 2 years ago
Cake day: July 20th, 2023


  • Dremor@lemmy.world (moderator) · replying to MtoGames@lemmy.world, “New moderator for this community!” · edited 2 days ago

    That’s indeed what we asked to be done next time for big events. But when no such big event is happening, it’s OK.

    In my opinion, the following trailers are acceptable:

    • Announcement of a new game -> So we can start talking about that game
    • Announcement of the end of early access or of a game’s release (like Hades 2 recently) -> So we can talk about the game in its release state
    • Major update of a game (as long as it isn’t too frequent) -> So we can discuss how that specific update makes the game evolve, for better or worse. I’d honestly still prefer those to be kept to their dedicated communities, but one from time to time won’t kill anyone.

    Overall, that’s also a way to advertise the dedicated communities for those specific games, so I think that’s alright.

    We’ll discuss all of this with the upcoming community rules update.


  • Fair point. I do agree with the “click to execute challenge” approach.

    For the terminal browser, it has more to do with it not respecting web standards than with Anubis not working on it.

    As for old hardware, I do agree that a delay before the challenge could be a good idea, if it weren’t so easy to circumvent. In that case bots would just wait in the background and resume once the timer expires, which would vastly decrease Anubis’s effectiveness, as idling costs them almost nothing. There isn’t really much that can be done here.

    As for the CUDA solution, that will depend on the hash algorithm being used. Some of them (like the one used by Monero) are designed to be vastly less efficient on a GPU than on a CPU. Moreover, GPU servers are far more expensive to run than CPU ones, so the result would be the same: crawling would be more expensive.
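
    To give a rough idea (this is a toy Python sketch of a proof-of-work puzzle, not Anubis’s actual code or algorithm), the puzzle itself stays the same; what changes the hardware economics is the work function, for example swapping plain SHA-256 for a memory-hard function like scrypt:

    ```python
    import hashlib

    # Purely illustrative proof-of-work sketch, not Anubis's real implementation.

    def work_sha256(data: bytes) -> bytes:
        # Cheap and highly parallel: a GPU farm chews through this kind of puzzle.
        return hashlib.sha256(data).digest()

    def work_scrypt(data: bytes) -> bytes:
        # Memory-hard: each attempt needs ~16 MiB of RAM (128 * r * n bytes),
        # which is exactly what makes massive GPU parallelism uneconomical.
        return hashlib.scrypt(data, salt=b"pow-demo", n=2**14, r=8, p=1,
                              maxmem=64 * 1024 * 1024)

    def solve(challenge: bytes, difficulty_bits: int, work) -> int:
        # Try nonces until the digest starts with `difficulty_bits` zero bits.
        nonce = 0
        while True:
            digest = work(challenge + nonce.to_bytes(8, "big"))
            if int.from_bytes(digest, "big") >> (len(digest) * 8 - difficulty_bits) == 0:
                return nonce
            nonce += 1

    # Same difficulty, very different hardware economics:
    print(solve(b"server-issued-challenge", 12, work_sha256))   # trivial for GPUs
    # solve(b"server-issued-challenge", 12, work_scrypt)        # slow, RAM-bound, CPU territory
    ```

    Monero’s RandomX pushes the same idea much further, but the principle is identical: make each guess expensive in a way GPUs don’t accelerate well.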

    In any case, the best solution by far would be to make respecting robots.txt a legal requirement, but for now legislators prefer to look the other way.
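
    Honouring robots.txt would cost crawlers almost nothing, either; Python’s standard library already does the parsing (example.org is just a placeholder here):

    ```python
    from urllib import robotparser

    # A well-behaved crawler fetches robots.txt once per site...
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.org/robots.txt")  # placeholder site
    rp.read()

    # ...then asks before every request; it's a single in-memory lookup.
    url = "https://example.org/some/page"
    if rp.can_fetch("MyCrawlerBot/1.0", url):
        pass  # allowed: go ahead and download
    else:
        pass  # disallowed: the site asked not to be crawled here, so skip it
    ```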


  • Whether they solve it or not doesn’t change the fact that they have to spend more resources on crawling, which is the objective here. And in return, the website sees a lot less load than before it used Anubis. Either way, I see it as a win.

    But despite that, it has its detractors, like any solution that becomes popular.

    But let’s be honest, what are the arguments against it?
    It takes a bit longer to access a site the first time? Sure, but it’s not like you have to click or type anything.
    It executes foreign code on your machine? Literally 90% of the web does these days. Just disable JavaScript and see how many websites are still functional. I’d be surprised if even a handful were.

    The only ones with anything to gain from a site not having Anubis are web crawlers, be they AI bots, indexing bots, or script kiddies looking for a vulnerable target.
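
    The asymmetry is easy to see on a toy version (again, a sketch, not Anubis’s real code): the visitor grinds through tens of thousands of hashes once, while the site checks the answer with a single hash.

    ```python
    import hashlib
    from itertools import count

    DIFFICULTY = 4  # leading zero hex digits; each extra digit is ~16x more work

    def solve(challenge: str) -> int:
        # The visitor's (or the crawler's) side: ~16**4 = 65k attempts on average.
        for nonce in count():
            digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
            if digest.startswith("0" * DIFFICULTY):
                return nonce

    def verify(challenge: str, nonce: int) -> bool:
        # The website's side: exactly one hash per visitor, whatever the difficulty.
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    nonce = solve("per-visitor-token")
    assert verify("per-visitor-token", nonce)
    ```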


  • I did try to work for an open-source company, but strangely none of them accepted .NET as relevant experience. So I had to either find an entry-level Java position and cut my paycheck in half, or keep working where I am while changing things from the inside.

    I already managed to introduce some open-source tools here and there (we now use DBeaver instead of SSMS and Insomnia instead of Postman, among others), and I intend to continue for as long as I can.

    As for the appointment, in about 70 years, according to current life expectancy.


  • You are talking about me, aren’t you?

    If so, no, I don’t work for Mistral at all, but I do work for a company selling M$ products to businesses. You know, to pay rent, buy food, things like that.
    But M$ requires us to be certified to get prospects from them, and as such we are encouraged to take at least all the basic certifications relevant to our field, which include AI, Azure, C#, and the like.

    That’s why I knew that using the Shavian alphabet is mostly useless, as even a basic free AI can mostly decipher it. If a free one can, I’ll leave it to your imagination what a more advanced one can do.

    Now why did I use Mistral? Simply because it happened to be installed on my phone for testing purposes. I rarely use it, but I have to admit it is useful in specific scenarios. But once I can install a hardware-accelerated local AI on my phone, Mistral can eat shit.