Karna@lemmy.ml to Firefox@lemmy.ml · 8 months ago

Meet Orbit, Mozilla's AI Assistant Extension for Firefox

www.omgubuntu.co.uk

Orbit by Mozilla is a new AI-powered assistant for the Firefox web browser that makes summarising web content while you browse as easy as clicking a…
  • Jeena@piefed.jeena.net · 8 months ago

    Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending their data to Mozilla.

    • Karna@lemmy.ml (OP) · 8 months ago

      In such a scenario you need to host your choice of LLM locally.

      • ReversalHatchery@beehaw.org · 8 months ago

        Does the add-on support usage like that?

        • Karna@lemmy.ml (OP) · 8 months ago

          No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.

          I have had this setup running for a while now.
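
          For reference, a minimal sketch of the relevant about:config prefs (the pref names here are from recent Firefox releases and are an assumption; they may differ in your version): set browser.ml.chat.enabled to true, set browser.ml.chat.hideLocalhost to false so localhost providers are offered, and point browser.ml.chat.provider at your self-hosted chat UI, e.g. http://localhost:3000 for a local Open WebUI instance.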

          • cmgvd3lw@discuss.tchncs.de · 8 months ago

            Which model are you running? How much RAM?

            • Karna@lemmy.ml (OP) · 8 months ago (edited)

              My (Docker-based) configuration:

              Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

              Hardware: Intel i5-13600K, Nvidia RTX 3070 Ti (8 GB VRAM), 32 GB RAM

              Docker: https://docs.docker.com/engine/install/

              Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

              Open WebUI: https://docs.openwebui.com/

              Ollama: https://hub.docker.com/r/ollama/ollama
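
              A rough sketch of that stack as shell commands, based on the linked docs (container names, ports, and the llama3.1 tag are illustrative; check the docs for the current syntax):

                  # Ollama with GPU access (needs the Nvidia container toolkit linked above)
                  docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
                  docker exec -it ollama ollama pull llama3.1

                  # Open WebUI on http://localhost:3000, pointed at the Ollama container via the host
                  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
                    -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
                    -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

              Once both containers are up, the Open WebUI address (http://localhost:3000 in this sketch) is what you point Firefox’s AI chatbot provider at.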

    • LWD@lemm.ee · 2 months ago (edited)

      deleted by creator

    • Hamartiogonic@sopuli.xyz · 8 months ago (edited)

      According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.

      If you’re working for Purism, Red Hat, or some other company like that, you might want to host your own LLM instead.



