• BeN9o@lemmy.world
    link
    fedilink
    English
    arrow-up
    22
    ·
    4 days ago

    I’m actually happy to say this is me. I recently installed Mint on a separate M.2 drive from Windows; I wanted to just test it. I now find myself almost permanently on Mint, only going back to Windows once to play a multiplayer game that isn’t on Linux yet.

    • Mrkawfee@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      4 days ago

      Same. There are one or two things that don’t work on Linux yet or are buggy so I have Windows on a separate drive. I hardly use it though.

    • Duamerthrax@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      ·
      edit-2
      3 days ago

      I was triple-booting a Hackintosh for a while and kept them on their own drives. You have to, because Windows updates like to screw with the UEFI of the drive it’s installed on at random times. Somehow, Windows was less stable than OS X running on unapproved hardware.

  • altphoto@lemmy.today
    link
    fedilink
    arrow-up
    19
    ·
    4 days ago

    Best setup ever:

    1) Install Linux on one drive.
    2) Install Windows on a second drive.
    3) Boot from GRUB on the first drive and add an entry to boot Windows.
    4) Format a third drive ext3, or optionally DOS. Mount this puppy at /home or even /home/user.
    5) Don’t let Windows touch your Linux home drive, ever. Fuck Windows and Microsoft. Both can suck my entire ass. If you ever need to share files between these systems, use a pen drive. Microsoft doesn’t deserve you. Just use it as a last resort, do your thing and GTFO ASAP.
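
    The GRUB entry from step 3 and the regeneration step might look roughly like this. This is only a sketch: the UUID is a placeholder for your Windows EFI partition (find the real one with `blkid`), and the menuentry goes in a custom file so updates don’t clobber it.

```shell
# Hypothetical /etc/grub.d/40_custom entry chainloading Windows on the second drive.
# "1234-ABCD" is a placeholder UUID for the Windows EFI system partition.
menuentry "Windows" {
    insmod part_gpt
    insmod fat
    insmod chain
    search --fs-uuid --set=root 1234-ABCD
    chainloader /EFI/Microsoft/Boot/bootmgfw.efi
}
```

    After editing, regenerate the config (on Mint/Debian-family systems, `sudo update-grub`).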

    • wpb@lemmy.world
      link
      fedilink
      arrow-up
      28
      ·
      4 days ago

      I’ve got this setup, but optimized slightly:

      1. Install linux on one drive
    • aesthelete@lemmy.world
      link
      fedilink
      arrow-up
      8
      ·
      edit-2
      4 days ago

      I used to run Windows on an eSATA drive that I would only power up occasionally in order to game, and it still somehow – and I don’t remember how – managed to ruin my computer.

      • altphoto@lemmy.today
        link
        fedilink
        arrow-up
        1
        ·
        4 days ago

        Yeah, an isolated home drive is the way to go. You just nuke Linux and Windows and restart, but your stuff is safe.

    • SavvyWolf@pawb.social
      link
      fedilink
      English
      arrow-up
      6
      ·
      3 days ago

      Just a heads up to anyone reading this: don’t format your home folder as FAT32/NTFS. Some stuff in there needs Linux-specific permission bits, and you might be limited in terms of maximum file size.

      Consider mounting at /home/username/shared or something instead if you want a shared drive.
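
      A sketch of what that could look like in /etc/fstab (the UUID, mount point and uid/gid are placeholders for your own values):

```
# Hypothetical shared NTFS data partition, readable from both OSes.
# uid/gid make the files owned by your Linux user; umask opens read access.
UUID=0123456789ABCDEF  /home/username/shared  ntfs-3g  defaults,uid=1000,gid=1000,umask=022  0  0
```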

    • bollybing@lemmynsfw.com
      link
      fedilink
      arrow-up
      3
      ·
      3 days ago

      Does this work to prevent Windows from fucking your bootloader in all cases? Also, I don’t quite get the importance of step 4?

      • altphoto@lemmy.today
        link
        fedilink
        arrow-up
        2
        ·
        3 days ago

        Step 4 is, in my opinion, the most clever and important part… Basically, if you remove your home drive and boot, you get a vanilla computer. If you put it back, you get your computer back… i.e., if you fuck up your Linux or Windows install, you just remove your home, reinstall blind, and put your home back in… like you never left!!! Plus, if your OS drive dies, you can just make another! Or you can even take your home folder with you from one Linux box to a new one in the blink of an eye… a very slow blink… Hold on, I’m still pulling the drive… open slowly… Done! See? Easy!

    • Wolf@lemmy.today
      link
      fedilink
      arrow-up
      3
      ·
      edit-2
      4 days ago

      What’s wrong with a VM? I set up a Win10 instance in VMM right after I switched to Linux full time 10 months ago, but I had to use it exactly once, to configure the RGB on my keyboard, and haven’t had a reason to boot it up since.

      From what I understood, it runs on ‘Bare Metal’, which means that it theoretically should perform just as well as if you booted into it, with the only overhead being the *nix host, which is minimal.

      I’m not saying it’s better, I’m honestly asking because I have very little experience with it.

      I used to dual boot back in the day, but that was when I was still on HDDs and the long ass boot times meant I usually just stayed in Windows if I was planning on gaming that day.

      • areyouevenreal@lemm.ee
        link
        fedilink
        arrow-up
        3
        ·
        4 days ago

        That’s not how that works. I think you’re confusing bare metal with a bare-metal hypervisor. The latter means a Type-1 hypervisor, which KVM isn’t anyway, but that’s another story.

        Without GPU pass through you aren’t going to get nearly the graphics performance for something like gaming. I’ve also had issues with KVM and libvirt breaking during sleep. It’s a lot more janky than you make out.

        • Wolf@lemmy.today
          link
          fedilink
          arrow-up
          2
          ·
          edit-2
          4 days ago

          Well it does seem to be a confusing subject, so forgive me for getting it wrong. I must have misunderstood or misremembered the information I read when setting up the VM 10 months ago. As I said, I have very little experience with them and was honestly just asking if it’s not almost as good. I wasn’t trying to ‘make it out’ to be ‘not janky’.

          According to Wiki, KVM " is a … virtualization module in the Linux kernel that allows the kernel to function as a hypervisor."

          I wasn’t aware that there was a distinction between a Hypervisor and a ‘Type-1’ Hypervisor, but now I know so thank you for clearing that up for me.

          Without GPU pass through you aren’t going to get nearly the graphics performance for something like gaming.

          According to this wiki, it seems like GPU passthrough is possible with KVM if your system supports IOMMU, mine does. But it looks like you also need a separate GPU to do that, so that answers my question about is it nearly as good as dual booting.
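
          If anyone wants to check the IOMMU part on their own box, here’s a rough sketch (this checks the standard sysfs location; an empty or missing directory usually means VT-d/AMD-Vi is off in firmware):

```shell
# Sketch: report whether the kernel exposes IOMMU groups
# (a prerequisite for GPU passthrough).
iommu_status() {
    if [ -d /sys/kernel/iommu_groups ] && [ -n "$(ls -A /sys/kernel/iommu_groups 2>/dev/null)" ]; then
        echo "IOMMU appears enabled"
    else
        echo "IOMMU not detected (check firmware settings for VT-d or AMD-Vi)"
    fi
}
iommu_status
```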

          Every game I have attempted to run has just worked and they seem to run just as good as they did in Windows, so I guess I’m lucky I don’t need to really worry about dual booting or VM’s. I was just kind of wondering if it would work if I did need it, since that seemed like it would be a lot simpler than booting into a different operating system.

          • areyouevenreal@lemm.ee
            link
            fedilink
            arrow-up
            1
            ·
            2 days ago

            Yes, I know GPU passthrough is possible. Almost no one does it, as consumer GPUs don’t normally support the virtualization technologies that allow multiple OSes to use one GPU; it’s mostly an enterprise feature. There are projects like VirGL that work with KVM and QEMU, but they don’t support Windows last I checked, and are imperfect even on Linux guests. I think only Apple Silicon and Intel integrated graphics support the right technologies you would need. Buying a second GPU is a good option, although that has its own complexities and is obviously more expensive. Most modern consumer platforms don’t have enough PCIe lanes to give two GPUs full x16 bandwidth. There is a technology on Windows called GPU paravirtualization that makes this work with Hyper-V, but you have to be using a Hyper-V host, not a Linux-based one. It’s also quite finicky to make that work.

            Out of interest what games are you running that don’t need GPU performance? Basically any modern 3D game needs a GPU to run well. Obviously 2D games might not, though even that varies.

            All of the above is far more complex than setting up a dual boot. A dual boot can be as simple as having two different drives and picking which to boot from in the UEFI or BIOS firmware. I don’t understand why you think that would be less complicated than a high tech solution like virtualization.

            There are basically three types of virtualization in classical thinking: Type 1, Type 2, and Type 3. KVM is none of these. With Type 1 there is no operating system running on bare metal; only the hypervisor itself does. Everything else, including the management tools for the hypervisor, runs in guest OSes. Hyper-V, ESXi, and anything using Xen are great examples. Type 2 is where you have virtualization software running inside a normal OS.

            KVM is special because it’s a hypervisor running in the same CPU ring and privilege level as the full Linux kernel. It’s like a Type-1 hypervisor running at the same time as a normal OS in the same space. This means it behaves somewhat like a Type 1 and somewhat like a Type 2: it’s bare metal just like a Type 1 would be, but has to share resources with Linux processes and other parts of the Linux kernel. You could kind of say it’s a Type 1.5.

            It’s not the only hypervisor these days to use that approach, and the Type 1/2/3 terminology kind of breaks down in modern usage anyway; modern virtualization has gotten a bit too complex for simplifications like that to always apply. Type 3 had to be added to account for containers, for example. This ends up getting weird when you have modern Linux systems that get to be a Type-1.5 hypervisor while also being a Type 3 at the same time.
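
            As a practical aside, whether KVM is usable at all on a given machine mostly comes down to the /dev/kvm device node existing. A quick sketch of a check:

```shell
# Sketch: /dev/kvm appears when the kvm module (plus kvm_intel or kvm_amd)
# is loaded and hardware virtualization is enabled in firmware.
kvm_status() {
    if [ -e /dev/kvm ]; then
        echo "KVM device present"
    else
        echo "KVM device absent (module not loaded or virtualization disabled in firmware)"
    fi
}
kvm_status
```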

            • Wolf@lemmy.today
              link
              fedilink
              arrow-up
              1
              ·
              2 days ago

              Out of interest what games are you running that don’t need GPU performance? Basically any modern 3D game needs a GPU to run well.

              I think you misunderstood me. I said “Every game I have attempted to run has just worked and they seem to run just as good as they did in Windows, so I guess I’m lucky I don’t need to really worry about dual booting or VM’s

              The games I play do need GPU performance. Cyberpunk 2077, Red Dead Redemption 2, No Mans Sky, The Outer Worlds etc. I’m not running them in a VM, I’m running them through Steam or Heroic Games Launcher.

              I don’t understand why you think that would be less complicated than a high tech solution like virtualization.

              Because once you have everything set up properly, all you would need to do to play a game that you couldn’t play in Linux is fire up the VM and play it. In a dual-boot situation you would have to reboot your computer into a whole different OS and then play the game. It wouldn’t be a massive difference, but it would be more convenient. Plus it would be contained, so there would be no way for it to mess with your bootloader or whatever. Clearly it’s more complicated than I had originally thought.

              KVM is special because it’s a hypervisor running in the same CPU ring and privilege level as the full Linux kernel. It’s like if a Type-1 hypervisor ran at the same time as a normal OS in the same space. This means it behaves somewhat like a Type-1 and somewhat like a Type-2. It’s bare metal just like a Type-1 would be, but has to share resources with Linux processes and other parts of the Linux kernel.

              Ok, now you got me curious. What is the distinction between that and how I originally described it?

              From what I understood, it runs on ‘Bare Metal’ which means that it theoretically should perform just as well as if you booted into it, with the only overhead being the (Linux OS) which is minimal.”

              From my admittedly layman’s understanding, it kinda seems like what you said and how I described it are pretty much the same thing.

              • areyouevenreal@lemm.ee
                link
                fedilink
                arrow-up
                2
                ·
                2 days ago

                An OS or a hypervisor can run on bare metal. If I have Windows running in KVM, KVM is running on bare metal but Windows isn’t. Ditto with ESXi or Hyper-V. In the case of your setup, Linux and KVM are both bare metal, but Windows isn’t. KVM, ESXi, and Xen always run at a privilege level above their guests. Does this make sense?

                The difference between KVM and the more conventional Type-1 hypervisors is that a conventional Type 1 can’t run alongside a normal kernel. So with Linux and KVM, both Linux and KVM are bare metal. With Linux and Xen, only Xen is bare metal, and Linux is a guest. Likewise, if you have something like Hyper-V or WSL2 on Windows, then Windows is actually running as a guest OS, as is Linux or any other guests you have; only Hyper-V is running natively. Some people still consider KVM a Type 1, since it is running on bare metal itself, but you can see how it’s different from the model other Type-1 hypervisors use. It’s a naming issue in that regard.

                It might help to read up more on virtualization technology. I am sure someone can explain this stuff better than me.

                • Wolf@lemmy.today
                  link
                  fedilink
                  arrow-up
                  1
                  ·
                  2 days ago

                  Does this make sense?

                  It’s still a little confusing but I get what you are saying. I’ll look into it. Thanks for explaining :)

  • Imacat@lemmy.dbzer0.com
    link
    fedilink
    arrow-up
    27
    ·
    4 days ago

    Last time I booted into windows it wiped my grub partition. That was the day I decided I didn’t really need windows anymore.

    • Carrot@lemmy.today
      link
      fedilink
      arrow-up
      7
      ·
      3 days ago

      For anyone that needs to hear this, the way to prevent this is to have Linux and Windows on separate drives.

    • TheRedSpade@lemmy.world
      link
      fedilink
      arrow-up
      10
      ·
      4 days ago

      I still don’t understand gta online. For me the whole point of the GTA games was that you could do anything without a single thought because you were the only real person involved. That disappears when you add other people.

      • juipeltje@lemmy.world
        link
        fedilink
        arrow-up
        4
        ·
        4 days ago

        Well, I have to admit I’ve actually been treating GTA Online as a single-player grind game for the most part. On PS4/5 I did play together with a friend of mine, though, but playing in a lobby with randoms can definitely be frustrating, especially if you are a grinder, because lots of people like blowing your shit up. I’m honestly still shocked that Rockstar allows you to pretty much do everything in invite-only lobbies now, because I remember having to do all kinds of tricks with my internet connection to get into a public lobby by myself.

      • mybuttnolie@sopuli.xyz
        link
        fedilink
        arrow-up
        3
        ·
        4 days ago

        for me it was that i was able to play with friends. i don’t have any, but if i did, we would’ve had some fun with heists.

    • doktormerlin@feddit.org
      link
      fedilink
      arrow-up
      2
      arrow-down
      1
      ·
      3 days ago

      Sadly, if you do PC gaming you are sometimes forced to use Windows. If all your friends play League of Legends, it doesn’t help you to say “but DotA 2 runs on Linux”; you need Windows or you can’t play with your friends. Same goes for lots of multiplayer games like GTA Online.

      All the tools and stuff, I agree. But for games there just is no option other than Windows sometimes.

      • Marduk73@sh.itjust.works
        link
        fedilink
        arrow-up
        1
        ·
        3 days ago

        Every situation is different. I used to PC game from the ’90s on. I now just Steam some games, and console and VR the rest. CAD is done with FreeCAD, GIMP / ’shop for images, and I only play games solo. My few friends from 40 years ago just do board games and disc golf.

        Again situations are different. What i stated is accurate in my case.

        • doktormerlin@feddit.org
          link
          fedilink
          arrow-up
          1
          ·
          3 days ago

          I never said anything against it, I just said why that’s not feasible for everyone.

          BTW: If you don’t want to use FreeCAD, Fusion360 works completely fine in Linux using this helper. Again: this is not to say that you should not use FreeCAD, it’s just more information for other people reading this thread.

  • hansolo@lemmy.today
    link
    fedilink
    arrow-up
    20
    ·
    4 days ago

    The day I wiped all partitions from my dual boot and started fresh with no windows on the machine was a revelation. My heart sang and my soul wept with joy. Windows lives in a caged state now, a neutered monster I rarely demand dance for me because it is ugly and awkward and on an external drive I don’t care about.

  • MudMan@fedia.io
    link
    fedilink
    arrow-up
    9
    arrow-down
    2
    ·
    4 days ago

    So in my dual boot setup Linux messes up the dedicated audio card so bad it not only sounds like ass on Linux but it somehow garbles Windows audio until I power cycle the entire thing. It is entirely possible it does permanent damage to the hardware. Some of the electrical clicks you hear from it are genuinely concerning.

    Had to plug in Linux audio via the motherboard audio and use different sources for each OS to work around it.

    Does change how the meme reads to me.

    Also, maaaan does Linux need to completely redo its audio systems from the ground up. It’s so bad that saying that isn’t even that controversial, which is insane in these circles.

    • brucethemoose@lemmy.world
      link
      fedilink
      arrow-up
      13
      ·
      edit-2
      4 days ago

      What distro? What sound card?

      You might try something new that runs pipewire by default, if you haven’t already. But I might also know of some specific quirks with specific cards.

      • Shardikprime@lemmy.world
        link
        fedilink
        arrow-up
        1
        arrow-down
        1
        ·
        2 days ago

        What distro? What sound card?

        Things a random user of Windows never asks themselves in their lives

        • brucethemoose@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          edit-2
          2 days ago

          Only because the sound card is exclusively designed for windows.

          It’s not that way anymore. I actually can’t configure gain (and some other features) for my Fiio KA3 on Windows. Now Android (and iOS) are their main priority.

          Which does give the useful quirk of allowing me to configure it in desktop linux…

          This is going to be a pattern though. It won’t necessarily get better for the Linux desktop, but Windows is going to increasingly feel the pain of being a “lower priority” OS for hardware.

      • MudMan@fedia.io
        link
        fedilink
        arrow-up
        2
        arrow-down
        8
        ·
        4 days ago

        We’re not doing this. People in the Linux community are so tweaked by years of bad support that they assume every complaint is a call for help.

        It is not.

        I know what’s broken, I know why, I know it’s not easily fixable, I have a workaround. This is not a tech support thread.

        I don’t need information from users more savvy than me; I need the bad sound firmware they’re loading in lieu of specific support for my audio card to be fixed, or better yet, replaced by actual specifically supported firmware so my card works. In the meantime, it’s crappy on-board audio and money wasted on hardware I’m not using.

          • MudMan@fedia.io
            link
            fedilink
            arrow-up
            2
            ·
            3 days ago

            That’s fair, and I don’t have a problem with that. I’m just annoyed by the tendency of the community to react to criticism with technical advice, which I find to be a frustrating crutch.

            FWIW, the card is a Sound Blaster X AE5 (that name sure has aged poorly), and I’ve had similar issues with it in both a Manjaro and a Bazzite install.

            • brucethemoose@lemmy.world
              link
              fedilink
              arrow-up
              1
              ·
              edit-2
              3 days ago

              IMO that reaction is healthy, as long as it isn’t a hostile “you’re holding it wrong” (which was not my intent, and is very much a community problem). Communal troubleshooting is the nature of the Linux desktop.

              If you don’t want advice, that’s fine, probably reasonable based on what you described. But I have had some similar (but not so severe) issues with FiiO and Xonar cards that got fixed with some low-level configs I had no idea existed.

              • MudMan@fedia.io
                link
                fedilink
                arrow-up
                1
                ·
                3 days ago

                And maybe I could get to some more in-depth solution that sorts it out, but that’s me spending time on a problem that a) I shouldn’t have to, and b) I have a functional workaround for already.

                Communal troubleshooting is the nature of Linux desktop, but also a massive problem. You shouldn’t need communal troubleshooting in the first place. It’s not a stand-in for proper UX, hardware compatibility or reliable implementation. If the goal is for more people to migrate to Linux the community needs to get over the assumption that troubleshooting is a valid answer to these types of issues.

                Which is not to say the community shouldn’t be helpful, but there’s this tendency to aggressively troubleshoot at people complaining about issues and limitations and then to snark at people actively asking for help troubleshooting for not reading documentation or not providing thorough enough logs and information. I find that obnoxious, admittedly because it’s been decades, so I may be on a hair trigger for it at this point.

        • GreatAlbatross@feddit.uk
          link
          fedilink
          English
          arrow-up
          2
          ·
          4 days ago

          If it helps for a future purchase, Focusrite’s external interfaces have been amazing for Linux support.

          To the point where I didn’t even notice; It just worked perfectly out of the box.

          I’m assuming you’ve already checked this, but is your interface set to the same sample rate and bit depth between Linux and Windows? Or, if it uses optical, whether it’s set to the same word clock source.
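
          On the Linux side, if PipeWire is in use, the rate can be pinned with a drop-in config so it matches what Windows is set to. A sketch (the path follows PipeWire’s usual conventions; 48000 is just an example value):

```
# Hypothetical ~/.config/pipewire/pipewire.conf.d/10-clock.conf
context.properties = {
    default.clock.rate          = 48000
    default.clock.allowed-rates = [ 44100 48000 ]
}
```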

          • MudMan@fedia.io
            link
            fedilink
            arrow-up
            2
            ·
            3 days ago

            I tried fiddling with the Windows settings, but that didn’t fix it immediately, and the sound is clearly wrong on Linux even with a power cycle. And googling for it I’m not alone in having issues and support for the thing is patchy. I mean, rebooting should have fixed it anyway. There’s no reason why either OS wouldn’t initialize those things on boot.

            I am not particularly committed to the thing, so I wouldn’t buy an upgrade. The only reason I have it is that at some point I ended up with a motherboard that wouldn’t do 5.1 out of the box, so I got something relatively affordable to slap in there. It sounds noticeably better than integrated audio, though, so now that I have it I’d like to use it, even if I’m no longer on the problematic old motherboard.

            But again, I dislike the tendency to respond with hardware recommendations or technical support. It’s kinda frustrating. And frankly, it works on Windows, so if I were looking for a fix, it’s right there. The onus is on Linux to support this type of setup, where the issue is not on the Windows side that’s a reboot away.

    • palordrolap@fedia.io
      link
      fedilink
      arrow-up
      8
      ·
      4 days ago

      From the ground up has been done at least once, but given there are multiple layers of interface and driver, it might not be at the right level for whatever hardware you have.

      I’m thinking specifically of how pipewire recently came along and basically took over the functions previously provided by pulseaudio, to the point of pretending to be Pulse where necessary so that things don’t break.

      FWIW, I recently learned that my motherboard has features that weren’t unlocked by default in my distro. Not related to sound, mind you, but nonetheless, I’ve gained access to that now. It required loading an extra kernel module. The same might be required to get the best out of your sound card.

      • MudMan@fedia.io
        link
        fedilink
        arrow-up
        2
        arrow-down
        2
        ·
        4 days ago

        Nah, it’s just not supported. Or rather, it’s poorly supported so it sounds worse than in Windows and it just doesn’t want to properly dual boot without a power cycle. Honestly, I haven’t checked if the soft reboot issue has been reported. Pretty sure it hasn’t. I could be nice and go find where to file a bug, but I haven’t gotten around to it and, frankly, there are enough other problems with this particular setup that nobody is fixing and are getting dismissed with “it’s the manufacturer’s fault” that I’m not particularly inclined to go out of my way.

        We don’t talk enough about how spotty new motherboard support is for Linux, either. At least sound is a recurring talking point. But yeah, newer motherboards often don’t pick up networking and audio hardware out of the box and need a lot of troubleshooting. Everybody is so proud of how well Linux revitalizes old laptops but nobody likes to talk about how that’s because they’re old, and newer stuff may not work well or at all. Early adopting hardware platforms on Linux can be a “going on an adventure” Hobbit meme experience.

        And you’re right that it’s not so much about audio getting re-engineered again as about it getting done right. I just don’t know that the current patchwork, barely holding together, can be salvaged by bolting more pieces on top. Every time Linux needs to replace something this way, it’s a years-long argument between nerds and a whole damn mess (see Wayland still being litigated, somehow). Audio never gets enough attention anywhere, and I have very low trust that a new attempt wouldn’t end up in the same mess they have now, at least for a long while. It extra sucks because Windows audio used to be kinda bad, but now it’s… kinda not? So for a dual-booter it’s just an extra reason to make that choice of which boot option to pick from the menu.

      • MudMan@fedia.io
        link
        fedilink
        arrow-up
        1
        arrow-down
        1
        ·
        4 days ago

        You would friggin’ think.

        And yet.

        Turns out a DAC is a fairly complicated and self-sufficient bundle of software and hardcore electrical circuitry, so apparently you can mess it up so badly it will remain broken across soft reboots. Who knew!

        • Possibly linux@lemmy.zip
          link
          fedilink
          English
          arrow-up
          1
          ·
          3 days ago

          My guess is that somehow data left in ram is causing the driver to get screwy. However, that still wouldn’t make sense as it shouldn’t be using uninitialized memory.

          I guess do a full power off before switching OS. The other option is to just run a Windows VM.

          • MudMan@fedia.io
            link
            fedilink
            arrow-up
            1
            ·
            3 days ago

            The other option is to just use Windows, which is the issue with these types of hardware incompatibility. I also have an ASUS laptop that won’t use their slightly weird speakers correctly and, again, that whole thing makes it much less practical to boot into Linux instead.