Download UltimaDark - The Fastest Dark Mode Extension for Firefox. UltimaDark uses aggressive & smart techniques to turn even the sunniest websites into realms of darkness.
Although it works well, this is so experimental it makes lab rats look like seasoned professionals.
Go ahead, embrace the shadows! 🦇
That’s not true for all sites. If the page is static then it’ll have no clue. If it’s dynamic and running a client-side script to report this info back, and if that information is collected, then I can see how that might be a useful supplement for fingerprinting if the server owner is so inclined. At that point though I’m wondering why a security-conscious user is raw dogging the internet and allowing scripts to run in their browser without consent (NoScript saves browsers).
Even then it’s unclear when/how altering the page to render it differently is commonly communicated back to the server, how much identifying information that talk-back is capable of conveying, and how we might mitigate those collections (wholesale abstinence and/or script control aside). What are the specific mechanisms of action we’re concerned about? This isn’t a faux challenge for the sake of hollow rhetoric. I’m ignorant, find the dialogue interesting, and am asking for help being less dumb. :)
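To make my own question concrete, here's the kind of mechanism I can picture (purely a sketch; the endpoint, payload, and `firstParty` marker are invented, and I'm not claiming any known site does exactly this). Dark-mode extensions typically inject `<style>` elements or rewrite inline styles, and a page script can watch for that and phone home:

```typescript
// Hypothetical sketch: a page script notices a dark-mode extension
// rewriting its styles and reports that back to the server.
const observer = new MutationObserver((mutations) => {
  const suspicious = mutations.some((m) =>
    [...m.addedNodes].some(
      // A style element the site didn't tag as its own is a hint
      // that an extension injected it. The data attribute is made up.
      (n) => n instanceof HTMLStyleElement && !n.dataset.firstParty
    )
  );
  if (suspicious) {
    // sendBeacon is fire-and-forget and survives page unload,
    // which is why analytics scripts favour it.
    navigator.sendBeacon("/collect", JSON.stringify({ darkModeExt: true }));
    observer.disconnect();
  }
});
observer.observe(document.documentElement, {
  childList: true,
  subtree: true,
  attributes: true,
  attributeFilter: ["style"],
});
```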
I found some brief and useful discussion in this Privacy Guides thread. Seems like the concern is valid but minimal for all but the most strict/defensive postures.
Trying to validate this myself for Dark Reader without breaking out Wireshark and monitoring some big tech site while I toggle color modes (which I might do later if I think of it and find the time), I see Dark Reader is open source, an Open Collective member, and seems to engender little hand-wringing. The only public gripe I can find is this misguided Orion Browser feedback thread.
Yes, this is absolutely possible for a website to do. It's probably also quite complicated technically, but there are multiple services for recording precise user behaviour, including all mouse movements on a website, so I would imagine there's something for this too.
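For a sense of how little code that takes, here's a toy version of what those replay scripts boil down to (the endpoint and batching interval are made up for the example):

```typescript
// Buffer pointer events and ship them to the server in batches,
// session-replay style. Purely illustrative.
type Point = { x: number; y: number; t: number };
const buffer: Point[] = [];

document.addEventListener("mousemove", (e) => {
  buffer.push({ x: e.clientX, y: e.clientY, t: performance.now() });
});

setInterval(() => {
  if (buffer.length === 0) return;
  // splice(0) empties the buffer and returns its contents.
  navigator.sendBeacon("/replay", JSON.stringify(buffer.splice(0)));
}, 5000);
```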
What are the specific mechanisms of action we’re concerned about?
I was thinking about the website's code running some light checksum on all the resources it has downloaded and loaded into the browser, and if it differs, uploading the diff. I think it should work for finding groups of people with a similar browser setup, but maybe it would work just as well as plain browser fingerprinting too.
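Something like this is what I have in mind (just a sketch; the baseline digest and telemetry endpoint are invented): hash what the browser actually ended up rendering and compare it against a value the server shipped with the page.

```typescript
// Rough sketch of the checksum idea: digest the style sheets the
// browser ended up with and report any mismatch with the baseline.
async function pageDigest(): Promise<string> {
  const css = [...document.styleSheets]
    .map((sheet) => {
      try {
        // An extension that injects or rewrites CSS changes this string.
        return [...sheet.cssRules].map((r) => r.cssText).join("");
      } catch {
        return ""; // cross-origin sheets throw on access
      }
    })
    .join("");
  const bytes = new TextEncoder().encode(css);
  const hash = await crypto.subtle.digest("SHA-256", bytes);
  return [...new Uint8Array(hash)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Placeholder: in practice the server would embed this in the page.
const EXPECTED = "baseline-digest-served-with-the-page";
pageDigest().then((d) => {
  if (d !== EXPECTED) {
    navigator.sendBeacon("/telemetry", JSON.stringify({ digest: d }));
  }
});
```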
Trying to validate this myself for Dark Reader without breaking out Wireshark and monitoring some big tech site while I toggle color modes (which I might do later if I think of it and find the time)
You would also need to set up a custom certificate authority to MITM the TLS traffic (a very blunt wording, but to the point).
I think you should be fine using the network tab in the normal browser devtools, or the one in the Browser Toolbox, as the latter is supposed to show all the traffic your browser makes.
Thanks for the interesting diversion!