• sanpo@sopuli.xyz · 45 points · 3 months ago

    Did anyone actually test how fast it is compared to Dark Reader?

    Calling yourself “the fastest” is all nice and good, but some benchmarks would be nice.

    • jangdonggun@lemmy.ml · 1 point · 3 months ago

      Yep, people have benchmarked:

      • Firefox without any Dark Mode addon = 27 points in Speedometer
      • With Dark Reader = about 11 points
      • With UltimaDark = 25-26 points
    • sag@lemm.ee (OP) · 6 up / 17 down · 3 months ago

      Try it yourself on a pretty low-end device and you will see the difference. It’s a lifesaver for my eyes and my pretty old computer.

  • lemmyvore@feddit.nl · 29 points · 3 months ago

    Ah it doesn’t work on Android? A pity, that’s where I need dark mode the most.

  • JubilantJaguar@lemmy.world · 23 points · 3 months ago

    Pro tip: Firefox can do dark mode natively, if you’re ready to accept some ugly websites.

    Settings > Manage Colors, then set your preferred hues and set Override to Always.

    It’s blazing fast with zero white flash, and most sites are perfectly legible.
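
    For what it’s worth, those same choices can be pre-seeded from a user.js file in the profile folder. A minimal sketch, assuming these are the prefs behind that dialog; the “2 = Always override” mapping and the hex colors are my assumptions, so verify them in about:config first:

        // user.js sketch: prefs that appear to back Settings > Manage Colors.
        // Value mapping and colors are assumptions; check about:config before relying on them.
        user_pref("browser.display.document_color_use", 2);   // assumed: 2 = "Always" override page colors
        user_pref("browser.display.background_color", "#1c1b22");
        user_pref("browser.display.foreground_color", "#fbfbfe");
        user_pref("browser.anchor_color", "#8ab4f8");
        user_pref("browser.visited_color", "#c58af9");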

    • fmstrat@lemmy.nowsci.com · 9 points · 3 months ago

      While I’m glad they’re trying this, it has the same problem as Brave: no configuration. Dark Reader lets you configure individual site profiles via a static/dynamic/etc. toggle to fix sites that don’t work well. Without that, nothing will compare.

  • thingsiplay@beehaw.org · 10 points · 3 months ago

    Although it works well, this is so experimental, it makes lab rats look like seasoned professionals.

    Looks good, but I’ll wait until it’s proven and stable.

      • thingsiplay@beehaw.org · 11 points · 3 months ago

        That doesn’t mean it’s stable. From his own description:

        This is still highly experimental so it can also ruin your internet experience

        • sag@lemm.ee (OP) · 4 up / 5 down · 3 months ago

          Yea, I mean it will take an eternity (not really) to become stable. xD

      • jangdonggun@lemmy.ml · 1 point · 2 months ago

        That’s because UltimaDark is taking the harder route and using a totally different API than Dark Reader does.
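
        If that different API is Gecko’s stream-filtering WebExtension API (my assumption; the repo would confirm), the rough idea is to rewrite HTML and CSS responses before the page ever renders, which would also explain the lack of a white flash. A minimal TypeScript sketch of that pattern, not UltimaDark’s actual code:

            // Requires "webRequest", "webRequestBlocking" and host permissions in manifest.json.
            declare const browser: any; // WebExtension global (Gecko); typed loosely for this sketch

            // Naive stand-in for the real color math.
            function darken(text: string): string {
              return text.replace(/#fff\b/gi, "#111");
            }

            // Rewrite document and stylesheet responses in-flight (Firefox-only API).
            function darkenBeforeRender(details: { requestId: string }): void {
              const filter = browser.webRequest.filterResponseData(details.requestId);
              const decoder = new TextDecoder("utf-8");
              const encoder = new TextEncoder();
              let body = "";

              filter.ondata = (event: { data: ArrayBuffer }) => {
                body += decoder.decode(event.data, { stream: true });
              };
              filter.onstop = () => {
                filter.write(encoder.encode(darken(body)));
                filter.close();
              };
            }

            browser.webRequest.onBeforeRequest.addListener(
              darkenBeforeRender,
              { urls: ["<all_urls>"], types: ["main_frame", "stylesheet"] },
              ["blocking"]
            );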

  • rbesfe@lemmy.ca · 9 points · 3 months ago

    Dark Reader has been in development since 2014 and is much more polished

  • Bogasse@lemmy.ml · 6 points · 3 months ago

    On my rather old FP3 it saves me a few seconds per page load, and the result looks quite comparable to Dark Reader.

    • sag@lemm.ee (OP) · 4 points · 3 months ago

      Yea, it uses native features of the Gecko engine; that’s why it’s faster than Dark Reader.
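
      For contrast, my understanding is that CSS-injection addons in the Dark Reader mould work after the document already exists: a content script injects an override stylesheet and then keeps recomputing colors as the page changes, which is where the extra work (and the brief white flash) comes from. A simplified sketch of that style of approach, not Dark Reader’s actual code:

          // Content-script approach (runs once the DOM exists): inject overrides.
          const override = document.createElement("style");
          override.textContent = `
            html, body { background: #111 !important; color: #ddd !important; }
            a { color: #8ab4f8 !important; }
          `;
          document.documentElement.appendChild(override);
          // A real addon would also walk elements, rewrite per-element colors,
          // and repeat that work whenever the page mutates.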

  • The Quuuuuill@slrpnk.net · 3 points · 3 months ago

    Maybe I’m an idiot, but I can’t find a source link. Is this open source? I was curious about finding information comparing it to Dark Reader.

  • karashta@fedia.io · 2 points · 3 months ago

    Anyone tried this with Twitch? I just get a gray screen instead of video. Anyone else? I really like this extension otherwise.

      • karashta@fedia.io · 2 points · 3 months ago

        The issue is more that the extension doesn’t seem to let sites bypass it properly, or something. I have to turn the extension off and refresh to get the picture back.

      • ReversalHatchery@beehaw.org · 3 points · 3 months ago

        Websites can look at their own structure, and they can see the changes addons make to them, for example if a CSS property was changed or added.

        Maybe there are ways around that, like with the use of a shadow DOM, but I’m not a web developer.
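
        As a rough illustration of what a page’s own script could do (a hypothetical detector, not something any particular site is known to run):

            // Watch for inline style edits and injected <style> elements,
            // which is roughly the footprint a dark-mode addon leaves behind.
            const detector = new MutationObserver((mutations) => {
              for (const m of mutations) {
                if (m.type === "attributes" && m.attributeName === "style") {
                  console.log("inline style changed on", m.target);
                }
                if (m.type === "childList") {
                  for (const node of Array.from(m.addedNodes)) {
                    if (node instanceof HTMLStyleElement) {
                      console.log("a <style> element was injected", node);
                    }
                  }
                }
              }
            });
            detector.observe(document.documentElement, {
              attributes: true,
              attributeFilter: ["style"],
              childList: true,
              subtree: true,
            });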

        • derek@infosec.pub · 2 points · 3 months ago

          That’s not true for all sites. If the page is static then it’ll have no clue. If it’s dynamic and running a client-side script to report this info back, and if that information is collected, then I can see how that might be a useful supplement for fingerprinting if the server owner is so inclined. At that point though I’m wondering why a security-conscious user is raw dogging the internet and allowing scripts to run in their browser without consent (NoScript saves browsers).

          Even then it’s unclear when/how altering the page to render it differently is commonly communicated back to the server, how much identifying information that talk-back is capable of conveying, and how we might mitigate those collections (wholesale abstinence and/or script control aside). What are the specific mechanisms of action we’re concerned about? This isn’t a faux challenge for the sake of hollow rhetoric. I’m ignorant, find the dialogue interesting, and am asking for help being less dumb. :)

          I found some brief and useful discussion in this Privacy Guides thread. Seems like the concern is valid but minimal for all but the most strict/defensive postures.

          Trying to validate this myself for Dark Reader without breaking out Wireshark and monitoring some big tech site while I toggle color modes (which I might do later if I think of it and find the time), I see that Dark Reader is open source, an Open Collective member, and seems to engender little hand-wringing. The only public gripe I can find is this misguided Orion Browser feedback thread.

          Thanks for the interesting diversion!

          • ReversalHatchery@beehaw.org · 1 point · 3 months ago

            Trying to validate this myself for Dark Reader without breaking out Wireshark and monitoring some big tech site while I toggle color modes (which I might do later if I think of it and find the time)

            You would also need to set up a custom certificate authority to MITM the TLS traffic (blunt wording, but to the point).
            I think you should be fine using the Network tab in the normal browser devtools, or the one in the Browser Toolbox, as the latter is supposed to show all traffic your browser makes.

          • ReversalHatchery@beehaw.org · 1 point · 3 months ago

            Yes, this is just something a website could do; I’m not saying anyone actually does it. It’s probably also quite complicated technically, but there are multiple services that record precise user behaviour, including all mouse movements on a website, so I would imagine something exists for this too.

            What are the specific mechanisms of action we’re concerned about?

            I was thinking of the website’s code running a light checksum over all the resources it has downloaded and loaded into the browser, and uploading the diff if anything differs. I think that could work for finding groups of people with a similar browser setup, but maybe it would work just as well as plain browser fingerprinting too.
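
            Roughly what I have in mind, as a hypothetical sketch (the endpoint name is made up): hash the stylesheet rules that actually got applied in the browser and report the digest, so clients whose pages were rewritten stand out.

                // Hash the applied same-origin stylesheet rules and report the digest.
                async function reportStyleDigest(): Promise<void> {
                  let cssText = "";
                  for (const sheet of Array.from(document.styleSheets)) {
                    try {
                      for (const rule of Array.from(sheet.cssRules)) cssText += rule.cssText;
                    } catch {
                      // Cross-origin stylesheets throw on access; skip them.
                    }
                  }
                  const bytes = new TextEncoder().encode(cssText);
                  const digest = await crypto.subtle.digest("SHA-256", bytes);
                  const hex = Array.from(new Uint8Array(digest))
                    .map((b) => b.toString(16).padStart(2, "0"))
                    .join("");
                  // "/telemetry/style-digest" is a hypothetical endpoint.
                  navigator.sendBeacon("/telemetry/style-digest", hex);
                }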