There's a workaround that involves going to about:config and setting xpinstall.signatures.required to false.
However, if you're running the Stable or Beta version, it will only work under Linux. On Windows and macOS you'll need to download Nightly or the Developer Edition.
To fix this on macOS I did the following:
1. Downloaded and installed Firefox Nightly
2. Ran /Applications/Firefox\ Nightly.app/Contents/MacOS/firefox-bin --profilemanager
3. Changed the profile to "default" so my normal Firefox profile would be used
4. Started up Firefox Nightly, opened about:config, then set xpinstall.signatures.required to false
Not sure if it's a good idea to use my default profile in Nightly. It might be a wiser idea to copy it instead.
Upgrading your profile from Release to Nightly, which happens automatically when you open it with Nightly, is a one-way, irreversible step. It could leave your profile unable to run under Release without crashes, or cause the loss of profile data such as bookmarks or saved passwords when the profile is later used with Release, depending on what work is underway in Nightly and whether it happens to be backwards-compatible. Be sure to back up your profile if you choose to switch channels.
Note: I am told that Developer channel uses a separate profile, but there are instructions below showing people how to override that, at which point this warning becomes relevant once again.
FWIW I started using beta, nightly and the old "UX" channel, first on Mac and then on Linux, and before I knew it could be a problem I switched between them with the same profile all the time. Maybe there were subtle bugs I wasn't aware of, but nothing I ever noticed.
I haven’t run into any issues in a while, but you only have to get hit by lightning one time to lose your profile data. Best to be consciously careful about it.
I do agree, and I'm more careful now. Always keep a backup, at the very least. I now symlink ~/bin/firefox to nightly because some apps seem to have it hardcoded to open "firefox" rather than what's set as default.
Yes, the risk remains. If I read this right (from my phone), Release is 66, Developer is 67, Nightly is 68. This isn’t guaranteed to be a problem, but it’s not guaranteed okay either. YMMV.
That’s a good point. However, some of the instructions below specifically tell people how to force any channel onto using the existing Release profile. I’ll update my post.
And I told the developer edition to use my regular profile because that's the one that has all my settings and add-ons and I didn't realize the risk was there. Guess at this point all I can really do is hope and cross the bridge when I get there.
Looks like it would have been better to copy the profile instead. I managed to get most of my profile back using Firefox Sync, though for some reason it didn't transfer my preferences and I had to redo those.
And frankly, this is an extra absurdity on top of that. If you’re going to require signatures for all extensions, regardless of user preference, shouldn’t you be keeping an eye on the signing process?
Why does Mozilla do this? Same with removing the option to not update. Why not let users choose (in the case of update maybe with an about config setting)?
Because (stable) users are dumb, are easily manipulated and can't be trusted. Thus the mothership has to be in control for the greater good.
They also argue that end-user computers are already effectively "compromised" from Mozilla's perspective, because adware runs installers with admin privileges and could insert things into the program folders. So anything the user can do, adware could do too, and therefore they can't give users any choice.
They put it in nicer words though.
To their credit, you can opt out, but only if you switch to Developer Edition, Nightly, or custom builds, which is either a one-way road (since downgrades corrupt profiles) or tedious (since you don't receive auto-updates).
But what they should really have done is allowing additional signing roots. Even secure boot does that.
This sounds like a threat model and mitigation developed by a college intern.
How, exactly, is a userland application going to protect itself from modification by a computer admin? I think DRM, anti-virus, and OS vendors everywhere would love an answer to this.
This threat model completely fails to account for live patching, trusted cert root modification, dll hooking, etc. Either the Mozilla security folks are incompetent / winging it, or this isn't the real reason.
I get the ostensible justification, but attacking this way requires the user to dig into the obscure dev settings and load an xpi from outside the browser[1]. Is there even one case of a user compromised that way?
[1] or at least they could have allowed that as a compromise
I updated my previous comment. They say there exist crapware installers that use elevated privileges that do inject stuff into the browser and that's why we can't have nice things, yes.
But I disagree with their value tradeoffs. They want to add a little "protection" - which is really flimsy since there is no privilege separation - for users who already compromised their systems with adware at the expense of the freedom of everyone else.
I'm totally fine with software already running on my machine being able to install addons into my browser. It can also already install a keylogger and record the screen, what's the big deal?
It is not possible for a userland application to prevent root processes from hijacking or modifying it. Such protection requires the protecting mechanism to run at a higher level of trust / security ring than the attacker.
It is probably safer to use an unbranded build with the same version as the currently installed Firefox (take note that it will not update). Page with links to the latest release builds: https://wiki.mozilla.org/Add-ons/Extension_Signing
What timezone are you in? I'm in UTC-4 (Detroit), and haven't seen any problems so far. (Also running Nightly on Arch Linux - I haven't made any previous changes to the addon signing either)
To clarify, by 'not working' I meant that none of the addons with signing issues are re-enabled after changing xpinstall.signatures.required; I might have wrongly assumed this would happen. However, installing a new addon I had never installed before works, while reinstalling one I had previously installed (uBlock Origin) still doesn't, even after uninstalling it.
My timezone is America/Los_Angeles.
EDIT: Sorry, I'm dumb. I actually have two versions of FF installed and I chose the one that wasn't Nightly.
Update: We have rolled out a partial fix for this issue. We generated a new intermediate certificate with the same name/key but an updated validity window and pushed it out to users via Normandy (this should be most users). Users who have Normandy on should see their add-ons start working over the next few hours. We are continuing to work on packaging up the new certificate for users who have Normandy disabled.
I've been through all of Firefox `about:config` a few times in the past, fixing preferences to, e.g., try to disable umpteen different services that leak info or create potential vulnerabilities gratuitously, but this is the first I recall hearing of Normandy.
Apparently I missed `app.normandy.enabled`, because I think I would've remembered a name with connotations of a bloody massive surprise attack.
Incidentally, `app.normandy.enabled` defaults to `true` in the `firefox-esr` Debian Stable package. Which seems wrong for an ESR.
For personal use (not development), I run 3 browsers (for features/configurations and an extra bit of compartmentalization): Tor Browser for most things, Firefox ESR with privacy tweaks for the small number of things that require login, and Chromium without much privacy tweaks for the rare occasion that a crucial site refuses to work with my TB or FF setup.
Today's crucial cert administration oops, plus learning of yet another very questionable remote capability/vector, plus the questionable preferences-changing being enabled even for ESR... is making me even less comfortable with the Web browser standards "big moat" barrier to entry situation.
I know Mozilla has some very forthright people, but I'd really like to see a conspicuous and pervasive focus on privacy&security, throughout the organization, which, at this point, would shake up a lot of things. Then, with the high ground established unambiguously, I'd like to see actively reversing some of the past surveillance&brochure tendencies in some standards. And also see some more creative approaches to what a browser can be, despite a hostile and exploitive environment. Or maybe Brave turns out to be a better vehicle for that, but I still want to believe in Mozilla.
I too use Debian's Firefox ESR. I noticed the "Allow Firefox to install and run studies" option in Privacy & Security Preferences a long time ago. It was unchecked and greyed out (i.e., unclickable), and a label below it says "Data reporting is disabled for this build configuration", so I gave it no further thought. This morning I woke up and launched Firefox, noticed this headline, and then noticed my extensions were still running. I looked in about:config and lo and behold, app.normandy.enabled=default [true]. I'll be filing a bug with debian to disable this in the build configuration.
Edit: There are some questions about whether Normandy is really enabled in Debian Firefox ESR even if the about:config setting defaults to true. I've filed a bug report, and I'm sure once a Debian maintainer has a chance to look at it we'll find out the answer.
I had mine disabled. So let's think about this for a second. If I disable a security hole that you can drive a semi-truck through, I remain foobar'd. If I run my "secure" firefox configuration, with the security hole enabled, then they un-foobar me first. Before anyone else. So I could effectively get rewarded, for always keeping a security hole open. But I didn't keep it open, so... yeah... they'll get around to me sometime.
?
>:-(
Grrr.
I'm just getting old and curmudgeonly maybe? I've decided though, I'm starting an animated security blog to show people the ludicrousness of all this kind of stuff in plain language. I'll be Statler, and I just need someone to be Waldorf. Because this stuff really is getting Statler and Waldorf level ridiculous.
We need people with standards in this industry, because that's the only source we have of market signals that prevent the market from going full user-hostile.
>If I disable a security hole that you can drive a semi-truck through, I remain foobar'd. If I run my "secure" firefox configuration, with the security hole enabled, then they un-foobar me first. Before anyone else. So I could effectively get rewarded, for always keeping a security hole open. But I didn't keep it open, so... yeah... they'll get around to me sometime.
That's needless drama. They will be rolling out the fix in a point release. Whatever way you use to update your browser will install that and get the fix. So the worst case is just going back to the old days, where you'd have the issue until your distro issued a new package or you manually updated the browser on Windows or macOS. What exactly would you expect that's not exactly what's happening?
But the point here is not about integrity, confidentiality, or availability. It is about whether you trust Mozilla, and how trustworthy they are.
A configuration where Mozilla cannot push remote updates is neither more secure nor less secure. Mozilla is often under fire for not allowing a privacy conscious, minimal trust use case.
How do you audit Firefox updates? Because if the answer is “I don’t”, Mozilla already controls the most important piece of userspace code on your computer. And if the answer is “I don’t install them”, then everyone with a few grand to spare already controls the most important piece of userspace code on your computer.
I rely on the Debian system to assist with that. Normandy bypasses that system, if it's enabled. (The jury is out whether it's actually enabled in Debian Firefox ESR.)
What do you think the median size of a Firefox release is, what do you think the resources (let’s call it US dollars FMV) are to audit that, and what do you think the resources Debian has to devote to it?
Clearly more eyes are good, but... In between “Wild West WebExtensions” and “Mozilla backdoors my Firefox and it gets used for nefarious purposes” and “delays in browser updates increase exploitation windows”, I know which threat models I’m buying.
I agree an unpatched vulnerability is probably more risky. However, this feature can change settings the user explicitly set. The bigger issue is that it gives me no indication the settings have been changed.
Yes, that's right: if you install software that had a bug, and you give someone permission to modify your software, you can get bug fixes faster.
Why even have an official channel, providing visibility and official oversight, if when it comes down to it, you're just gonna push remote code updates through the same side channel a potential hacker would use?
People are saying it's for convenience. OK, but then they have to understand that doing things in that fashion is a really bad look. And now your users are set up to believe that, at least some of the updates coming from the side channel are "trust"-able.
This is a piece of code downloaded from Mozilla servers to re-enable extensions, which are other pieces of code you download from Mozilla servers. If your threat model includes not trusting Mozilla servers then you've presumably disabled browser extensions (or sideloaded them) and this issue is irrelevant to you. If you do use extensions and get updated versions from Mozilla I don't see any way in which this increases your attack surface.
that would be good news, how can I verify that the Normandy feature isn't available in the Debian build?
Has Mozilla provided instructions to manually fix the issue? if so where? (XORcat was helpful to provide a solution, but I refuse to apply it if it doesn't come from Mozilla itself...)
Thank you for reporting it. Debian's pro-user stance is one of the things I like about it... now tell me they've disabled Pocket too and I might just switch from Arch to Debian Sid.
FYI, I have learned from other users' comments and the wiki page below that Studies and Normandy are different things. The former depends on the latter, but not vice versa. So it is possible that Debian disabled the studies program but did not disable the underlying Normandy tool. You might also want to look at whether firefox is affected in addition to firefox-esr.
On Windows I have the "Allow Firefox to install and run studies" option disabled, and yet in about:config Normandy was still enabled. I haven't received the fix. It could be that Firefox simply hasn't checked for it, or it could be that more than that one about:config setting determines whether Normandy runs.
Posting an update for posterity. It appears that even with app.normandy.enabled=default [true], as long as app.shield.optoutstudies.enabled=false, Normandy is disabled. app.shield.optoutstudies is the key controlled by the UI element "Allow Firefox to install and run studies".
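For anyone who wants to pin this down in the profile itself rather than through the UI, here is a minimal sketch of a `user.js` file placed in the profile directory, which re-applies prefs on every startup. The pref names are the ones discussed in this thread; whether they fully disable Normandy may vary across Firefox versions.

```javascript
// user.js in the Firefox profile directory.
// These lines are re-applied on every startup.

// Master switch for the Normandy recipe client.
user_pref("app.normandy.enabled", false);

// The pref behind "Allow Firefox to install and run studies".
// Per the observation above, keeping this false appears to keep
// Normandy inactive even when app.normandy.enabled is true.
user_pref("app.shield.optoutstudies.enabled", false);
```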
I've closed the above bug report as it's not really a bug.
As explained on Normandy's wiki page, they are related but two different things:
> Preference rollout is meant for permanent changes that we are sure of. Shield is meant for testing variations and figuring out what, if anything, is the best thing to do.
Except as we have learned "preference rollout" is also "installing extensions". So this is much the same as studies, but studies was disgraced, so now this is studies 2.0, no option to disable this time around.
And if you look at the big normandy JSON, hey, it's all the same Pocket and heartbeat shit we've seen from studies.
"Explained" is perhaps too generous a word. I'm a software engineer and I found that page to be confusing. It seems to be written for internal Mozilla employees, not for the general public.
As I said elsewhere: There are already channels for bug fixes, and some of the friction on those channels is intentional, such as for visibility and oversight/approval.
>12:50 p.m. UTC / 03:50 a.m. PDT: We rolled-out a fix for release, beta and nightly users. The fix will be automatically applied in the background within the next few hours, you don’t need to take active steps.
>In order to be able to provide this fix on short notice, we are using the Studies system. You can check if you have studies enabled by going to Firefox Preferences -> Privacy & Security -> Allow Firefox to install and run studies.
>You can disable studies again after your add-ons have been re-enabled.
>We are working on a general fix that doesn’t need to rely on this and will keep you updated.
I refuse to enable studies, even temporarily. This comes very close after the IE6 conspiracy revelation, where the ends justified the means.
Please provide a link to the certificate file, and step by step instructions for installing it, without enabling and conflating with mozilla studies...
From the looks of it, it installs the above plugin and changes `app.update.lastUpdateTime.xpi-signature-verification` to `1556945257`.
I can't get it to work in ESR 60 though. Getting file not found on "resource://gre/modules/addons/XPIDatabase.jsm"
edit: The linked XPI definitely seems to add the new certificate, whatever mechanism used to reverify the signatures just doesn't seem to work in 60.
edit2: Restarting Firefox appears to have forced the reverify... Possibly a flag that I twiddled with though, hard to be sure. Either way, the above should help people get everything running again without having to enable studies/normandy.
Yes, this is broken on ESR, but only somewhat broken.
The hotfix extension does two things:
1) It installs a new certificate for "CN=signingca1.addons.mozilla.org/emailAddress=foxsec@mozilla.com", effectively replacing the old certificate that expired. This should work.
2) It then tries to import the internal "resource://gre/modules/addons/XPIDatabase.jsm" module and calls XPIDatabase.verifySignatures().
This does not work on ESR, as "XPIDatabase.jsm" is a new-ish thing that isn't present in ESR yet. In ESR the function is still in "resource://gre/modules/addons/XPIProvider.jsm" (XPIProvider.verifySignatures()). Thankfully, the non-existing module is imported using ChromeUtils.defineModuleGetter, which only lazily loads the module on first access of the imported property, so after the certificate-adding code has run.
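The version-dependent module location is the crux of the ESR breakage. Here is a hypothetical sketch (not the actual hotfix code) of fallback logic that would cover both layouts. `tryImport` is a stand-in for the real `ChromeUtils.import`, so the resolution logic can be shown outside Firefox.

```javascript
// Hypothetical sketch: find verifySignatures() wherever the running
// Firefox version keeps it. tryImport(url) should return the module's
// exports, or throw if the module does not exist on this version.
function resolveVerifySignatures(tryImport) {
  // Newer Firefox moved the function to XPIDatabase.jsm;
  // ESR 60 still exposes it on XPIProvider.jsm.
  const candidates = [
    ["resource://gre/modules/addons/XPIDatabase.jsm", "XPIDatabase"],
    ["resource://gre/modules/addons/XPIProvider.jsm", "XPIProvider"],
  ];
  for (const [url, name] of candidates) {
    try {
      const mod = tryImport(url);
      if (mod && mod[name] && typeof mod[name].verifySignatures === "function") {
        // Bind so the caller can invoke it directly.
        return mod[name].verifySignatures.bind(mod[name]);
      }
    } catch (e) {
      // Module missing on this version; try the next location.
    }
  }
  throw new Error("verifySignatures not found in any known module");
}
```

On release/beta/Nightly the first candidate resolves; on ESR 60 the XPIDatabase.jsm import fails and the XPIProvider.jsm fallback would be used instead.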
That's an interesting question: when we install add-ons or extensions, are these hosted on Google servers? I'd rather not have Google know which versions of which add-ons I am running...
Unrelated to the cert problem: yes, clicking on the link installs the plugin, but it is surprising to see Firefox claim that it is news.ycombinator.com, not storage.googleapis.com, that wants to install the plugin.
Could it be a security issue, since if an attacker somehow manages to post/inject a link to a malicious plugin on a credible site, Firefox will claim the plugin is from that site?
You might have to reinstall them, unfortunately; on the system where I figured that out, Firefox had decided to uninstall them (I think because I first had to update the browser from the ancient version the user was running).
Thanks for the sleuthing, but who does this repository belong to? I'd like to apply it but only if mozilla provides such instruction on their issue page, I don't know who the actual owner of /moz-fx-normandy-prod-addons/ is...
I encourage you to go through the whole Normandy process yourself in a test environment, and even better (if possible), check out the code to see whether it looks legit or benign.
I'm happy, because I went through and checked it out myself without needing to enable Normandy on my actual Firefox, but ultimately, it will be great when Moz can get instructions for manually applying the fix out.
I actually did read that story but I don't understand what that has to do with anything being discussed here.
Yes, Youtube put up a banner asking IE6 users to move to a more modern browser 10 years ago. How is that in any way related to Firefox pushing a hotfix in 2019 to fix a certificate issue? Are you worried there is a big evil conspiracy to use this mechanism to uninstall Internet Explorer from people's computers?!
Okay, so, youtube targets a small subset of users, and changes their experience capriciously, and to suit their own purposes.
Firefox, it turns out, has a built-in telemetry system that defaults to enable exactly the same behavior: changing your system, to suit their desires.
Your words, “a big evil conspiracy to use this mechanism to uninstall Internet Explorer from people's computers”, are misleading. No one would propose that the intent is an attack on Microsoft applications. Rather, the intent is to blindfold users on a whim, should a Firefox component prove inconvenient to the providers of Firefox. Ostensibly, in the event that some add-on or extension threatens the bottom line for major backers of Firefox’s funding.
> Firefox, it turns out, has a built-in telemetry system that defaults to enable exactly the same behavior: changing your system, to suit their desires.
An example of the typical use of this system: say Mozilla wants to enable video hardware acceleration in Firefox but they don't know if bugs in video drivers or in Firefox will make crashing more frequent. So they enable hardware acceleration for 1% of users instead of 100% and compare the reported crash rate between the two to determine if it's ready to be pushed out universally.
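The staged-rollout mechanics described above can be illustrated with a toy sketch (purely illustrative, not Mozilla's actual Normandy code): hash a stable client ID so each client deterministically falls in or out of the experiment, with roughly `percent` of the population enrolled.

```javascript
// Illustrative sketch of percentage-based rollout bucketing.
// Deterministic: the same client ID always gets the same answer,
// and roughly `percent` of clients land in the enrolled bucket.
function inRollout(clientId, percent) {
  // Tiny FNV-1a hash; a real system would use a stronger hash.
  let h = 0x811c9dc5;
  for (const ch of clientId) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  // Map the hash onto 0..99 and compare against the target share.
  return h % 100 < percent;
}
```

With this scheme, ramping from 1% to 100% only ever adds clients to the enrolled bucket; nobody flips back and forth between configurations as the rollout widens.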
At some point in the next five to ten years we will see this "feature" abused. Maybe Mozilla will use it to "soften" commonly used ad blockers to enable "acceptable" ads for Firefox users. Maybe Mozilla will be hacked by some government that wants to enable MITM attacks against its citizens, and Normandy will make that happen. Or maybe Mozilla will just cooperate with the government trying to do so.
You say it is "typically" used for benevolent purposes, but why should we trust Mozilla? Mozilla does not have a stellar history with this sort of thing and in my experience they do not take security as seriously as they should if we are to trust them with such a feature.
The level of paranoia throughout this thread is truly through the roof.....
Mozilla has had several "PR nightmare" decisions that a vocal set of users didn't like, and sometimes were genuinely ill advised/bad/shitty. But as far as I can see they do not have a bad track record when it comes to security/privacy. Do you have any examples of actual serious security/privacy fuck ups by Mozilla/Firefox? I mean that stood up to scrutiny beyond the sensationalist headlines?
Their defaults might not be your defaults, but they are even working on bringing Tor into mainstream Firefox. None of this means they are above criticism of course, but... context!
The sum total of their actions points towards an organisation that has some internal problems but that is genuinely pursuing privacy and an open web as a goal for as many users as possible.
> But as far as I can see they do not have a bad track record when it comes to security/privacy. Do you have any examples of actual serious security/privacy fuck ups by Mozilla/Firefox?
I mean, they are currently shipping real actual ads on the new tab page that aren't blocked by ad blockers - and possibly can't be (there are limits to what WebExtensions can modify on Firefox internal pages). Sure, maybe your parent comment was exaggerating a little bit, but what if Mozilla instead starts inserting "privacy-friendly" "recommendations" into webpages in order to "enhance users' browsing experiences"? That doesn't sound at all far-fetched for the Mozilla we know today.
Besides your claim not being true as far as I can tell [1] (there are no ads on my new tab page, and as far as I can tell there has been no incident of paid-for content showing up on people's new tab pages), how exactly would shipping ads be a privacy/security violation?
This is exactly the sensationalist misrepresentation I was talking about. You don't like what they are doing, fine. Misrepresenting it as something that it's not is not fine.
Besides: Mozilla is funded in large parts by having Google as the default search provider. This means they are funded by Google selling ads. Them starting up new revenue streams and getting away from that funding model would be a pro privacy step.
[1] If you are referring to something else that I missed, feel free to enlighten me.
Maybe you've opted out of studies or otherwise disabled Pocket? That's how they're bundling much of this new stuff in.
See: https://help.getpocket.com/article/1142-firefox-new-tab-reco... especially the part that says "From time to time, the occasional sponsored story may appear as a recommendation from Pocket. These stories will always be clearly marked, and you have control over whether they’re shown on your new tab page."
All so-called recommendations I've seen have been spammy, the sort of stuff you see linked as "other articles you may enjoy" when you disable your ad blocker on bad sites. Regardless, this directly contradicts your claim that there haven't been incidents of sponsored content on the new tab page: this is explicitly what is happening according to Pocket's own website. Mozilla themselves explicitly said they are introducing sponsored stories to the new tab page: https://blog.mozilla.org/futurereleases/2018/01/24/update-on...
I think there's a world of difference between making a search engine that sells ads the default, and selling ads yourself and inserting them into the browser's chrome. Among other issues, if I help someone install an ad blocker, that ad blocker will block ads on Google, but will not block ads in the browser chrome.
So, given this and other recent behavior by Mozilla, I have to say I don't think seeing "related stories" inserted into the browser chrome for certain web pages is at all far fetched. That should worry us.
I actually don't see the pocket recommendations on my desktop (maybe the Linux Mint build has them disabled by default), but they are there on mobile. There is a UI setting to disable them of course. It's explained right on the page that you link to.
More importantly, that page also explains that no data gets sent to Mozilla or pocket or anyone else for these ads to show up.
So again, no privacy violation here. I also think it's an extreme leap from "they show this in the new tab page which they design and control" to "they could start showing it overlayed on other peoples content".
I think they got some decisions very wrong. Among them not implementing a way to allow people to override signing of addons, which people did warn about. Having signatures enforced as a strong default is certainly good and right, but if they had included a "right click on addon, use without signature (WARNING THIS IS SKETCHY REAL ADDONS DON'T ASK YOU TO DO THIS)" option this signing issue would have been relatively mild.
But their track record on privacy/security simply isn't as bad as people make it out to be.
This one isn’t very privacy-friendly or open. And that raises all the previous questions again. Should they maybe have learned something about clandestinely fucking with people’s systems?
This is the first I hear about Normandy[1]. Firefox has been my main browser for a long time, only because I could use uBlock origin. Now, all of a sudden that is disabled, and with the recent version they got rid of my ability to always prevent autoplaying of videos.
Apparently, no one associated with browsers can be trusted in the least.
In the recent version we added the ability to always prevent autoplaying of videos; in the next version we will add further UI to let the user disable all (not just muted) videos from autoplaying - https://bugzilla.mozilla.org/show_bug.cgi?id=1543812
I have spent ~10 years using Firefox daily, tweaking the config and getting the addons set up the way I want. I was a professional web developer for most of those years.
This is the first I have heard of Firefox changing my config settings invisibly in the background. This is obscene. Who on earth thought this was a good idea? The security ramifications are limitless.
I understand all too well that most companies have decided to start A/B testing things on subsets of users, but that doesn't mean you should force that mode of thinking into everything. What a horrible decision. I don't recall ever seeing any news or notifications or checkboxes about studies or "Normandy" at any point.
Are there some other good open source alternatives to Firefox? I remember hearing about Brave but also that it was tied into some cryptocoin nonsense, so I'm not sure what else to look at.
Looking Glass, Pocket, banning plugins based on political ideology, backdoors like Normandy and the Studies system, their creation of what amounts to a Mozilla version of the Ministry of Truth, their partnership with Cloudflare to send everyone's DNS to Cloudflare over HTTPS, and a whole host of other things
The way Firefox needs 5-10 privacy extensions to be usable isn't just inconvenient when the certs fail: you also have to trust all these strangers and their extension code.
I've been using Brave because of that: all of that is baked in, so my only extension is my password manager.
That is not what I meant by a UI knob, and I sure hope you knew that. By UI knob I mean something easily discoverable and self-explanatory. Rooting around a gated (with a mighty strong warning, I should add) config section for something called "normandy" is not intuitive, and it's not self-explanatory.
And I sure hope that by disclosed to users I did not mean some Hitchhiker's Guide-esque disclaimer on a wiki page. Something as (potentially) insidious as a preferences backdoor should absolutely be disclosed to users with the same level of visibility as the stories nonsense.
Perhaps "normandy" is entirely harmless, but you guys lost a metric fuckton of credibility by using your backdoors to spam people[1]. Playing coy does nothing to improve your credibility or reputation.
> In order to be able to provide this fix on short notice, we are using the Studies system. You can check if you have studies enabled by going to Firefox Preferences -> Privacy & Security -> Allow Firefox to install and run studies.
I happen to be one of the users with Normandy disabled, so I'm foobar'd anyway. That said, the reason I disabled it is because it is a security hole you could drive a semi-truck through. And now they want us to enable it to provide a "fix" for the secure way in?
I thought I was the only one who saw a problem with that. Your post is evidence that I'm not completely off in my thinking.
The studies system is also code-signed, but with a different certificate chain, hence why it wasn't affected. What security hole do you think this opens in Firefox?
If you don't trust your software provider, "studies" don't matter. The same thing could come through a regular update. If you don't want to be on the bleeding edge, that's fine, and if the UI for Normandy is bad, that's an issue, but it's nonsense to accept updates and then say you don't want updates.
No, it's not. This Normandy nonsense and stories are two separate, yet creepy features. I've already disabled stories but it looks like Mozilla still retains control of my preferences (without disclosing it).
Easy: There's a difference between static, shipped code and a capability to modify software at a distance (which could even be hijacked by an attacker who infiltrates Mozilla's infrastructure).
If your threat model includes the hijacking of Mozilla's infrastructure, I assume you read and verify the entirety of the Firefox source with every new version before using it, right?
But there are trustworthy people working with and integrating that code, there's a good chance they'll notice a hinky commit, and they're very close to having completely reproducible builds—which means that there can be verification that the shipped binary matches the inspected source.
Because Mozilla is easier to lock down than Chrome.
I guess "easier" isn't the word really, because Chrome can't really ever be locked down. It's pretty much always, effectively, an open book to Google.
You can lock down everything in Firefox. The drawback being, of course, times like this, when you can't get the fix unless you leave Normandy enabled. (Which I didn't.)
Setting preferences really should not be shocking, given that they have the capacity to run automatic updates. I'm more surprised that they can push code without certificates.
> I'm more surprised that they can push code without certificates.
Where are you getting this from? AFAIK all Mozilla code / prefs they can push should be signed -- this very issue seems to stem from the cert used to sign AMO extensions having expired.
They are using the Studies system in complete violation of how they said they would use it when it was announced. This is not surprising, since Mozilla is becoming about as trustworthy as Google or Facebook.
>> Is the existence of a back door method of updating Firefox preferences something that will be disclosed to users?
> It will even be documented for them:
That sounds like you do not think the concern is warranted. I've used Firefox since the first time it was available, and Netscape starting with the first ever betas. At no point was there a dialog that said "Do you want us to be able to change your browser settings remotely?"
>> What about a UI knob to disable it?
> app.normandy.enabled
That is not a "UI knob" by any stretch of the imagination. Looking in about:config revealed:
app.normandy.logging.level
Is there a way to find out what is being logged and why?
So, the question can be rephrased as "is the fact that Firefox has been logging all users' entire browsing history despite the fact that the user has not chosen to set up a Firefox account going to be disclosed?"
> So, the question can be rephrased as "is the fact that Firefox has been logging all users' entire browsing history despite the fact that the user has not chosen to set up a Firefox account going to be disclosed?"
Chill out, this preference only determines what is logged locally (never sent to the server). It's a debugging tool.
Look, at this point it’s not the user’s responsibility to “chill out”. It’s very much Firefox’s responsibility to try to repair their reputation by:
1. being completely transparent about all the mechanisms that data or code can be pushed to or pulled by the browser, or pushed from or pulled from the browser; and
2. having a toggle for all of them, yes every single one, in Privacy & Security.
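In the meantime, these switches can be flipped without waiting for a UI toggle by pinning the corresponding about:config preferences in a user.js file in the profile directory. This is a sketch: the pref names below are the ones commonly associated with Normandy, studies, and telemetry upload at the time of writing, but they should be verified against your own about:config before relying on them.

```javascript
// user.js — place this file in your Firefox profile directory.
// Prefs listed here are re-applied at every startup, which also
// reverts any value changed remotely while the browser was running.
// Assumption: these pref names match your Firefox version; check about:config.
user_pref("app.normandy.enabled", false);                     // Normandy (pref rollouts + studies delivery)
user_pref("app.shield.optoutstudies.enabled", false);         // "Allow Firefox to install and run studies" checkbox
user_pref("datareporting.healthreport.uploadEnabled", false); // telemetry upload
```

Note the trade-off mentioned elsewhere in the thread: with these disabled you also won't receive hotfixes that Mozilla ships through the studies channel.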
> Normandy Pref Rollout is a feature that allows Mozilla to change the default value of a preference for a targeted set of users, without deploying an update to Firefox.
Rolling out a new certificate goes beyond changing the default value of a preference, which rightly raises questions about what else Normandy allows that is not documented.
I've deleted my extensions thinking it was the extensions' issue. I'm trying to download again but it's telling me I don't have internet connection. Any work-arounds?
No offense, but they're not getting inhumane shock treatment either. If you're going to pull "our stress is holier than thou" and say it's magnitudes higher, I'll be waiting for scientific backing on this, or else it's just rude... plus they could always just not make me periodically re-install all my addons with no option to just bypass verification... except this time it doesn't work even with the fixes. (Actually, one time I just didn't see it was set to update automatically, downgrading one of them. I'm no expert on these issues, really. They just could have asked first, in my opinion.) Thanks, Mozillama.
Edit: I had to click "Restart with addons disabled (safe mode)" for those wondering.
It depends on the permissions: uBlock has permissions to read the entire page on any domain. Some extensions only limit themselves to specific domains, or don't touch that at all.
Meanwhile, as the "partial fix" is deployed, here is what I think fixed the issue for me: in the Preferences, under "Privacy & Security", check "Allow Firefox to install and run studies" (then wait for the current hotfixes to appear in about:studies then restart the browser).
I still get “Download failed. Please check your connection” when attempting to re-download the add-on you took away without asking me. That thing was pretty much the only reason I used Firefox at all.
The wiki entry evidently doesn't describe what it does because according to the wiki entry it allows for the enabling and disabling of preferences. The updating of a certificate is beyond what is described in the wiki.
Mozilla should follow up with a post describing exactly how Normandy works and the full capabilities it gives them.
From what I understand, Normandy is an infrastructure for delivering changes to some Firefox users (or all of them). There are two major use cases: preference rollouts and studies. In the first case, default values of preferences get changed (if your pref has a non-default value, it won't affect you). In the case of studies, a piece of code gets delivered and executed, which can do anything. In this hotfix, the study installs an add-on, which in turn installs the certificate.
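The default-vs-user-value distinction described above can be sketched in a few lines (illustrative only, not Mozilla's actual pref service): a rollout writes to the default branch, so a value the user has set explicitly always wins.

```javascript
// Toy model of a two-branch preference store (not Mozilla code).
// A Normandy pref rollout changes the *default* branch; an explicit
// about:config edit writes to the *user* branch, which takes priority.
class PrefStore {
  constructor() {
    this.defaults = new Map(); // default branch (what rollouts change)
    this.user = new Map();     // user branch (explicit overrides)
  }
  setDefault(name, value) { this.defaults.set(name, value); }
  setUser(name, value) { this.user.set(name, value); }
  get(name) {
    return this.user.has(name) ? this.user.get(name) : this.defaults.get(name);
  }
}

const prefs = new PrefStore();
prefs.setDefault("app.example.flag", false);
prefs.setUser("app.example.flag", true);     // user flips it in about:config
prefs.setDefault("app.example.flag", false); // simulated remote rollout
console.log(prefs.get("app.example.flag")); // -> true: the user value wins
```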
Well that's interesting. I see Normandy enabled, but if I go to the "Privacy and Security" section of the preferences page I see all the data collection and use stuff disabled. There's no obvious way to disable the Normandy back door.
Oh well, at least we don't have another season of Mr Robot spam to look forward to.
With an obscure name and no correlation to all the other spying and backdooring Mozilla is doing. Is this really the best option to get a privacy-focused browser? I think this is all very worrying.
Spying was the wrong word. But yes, the telemetry. The Google Analytics that is hidden on the extensions page, which only respects the Do Not Track setting, but not the turn-off-telemetry checkbox. Sadly it just doesn't seem to stop.
Thanks! Do you know how many active users were affected by this certificate error and subsequent addon disabling? I guess most users were spared due to the timing and short duration of the error.
Their system runs a check every 24 hours. So a lot of users were affected. They've rolled out a partial fix via their Normandy thingy, and according to them it will fix the issue for most people over the next couple of hours.
What bothers me is that everyone seems to be accepting the usage of these newspeakish, romantic-sounding euphemisms that are completely opaque about the purpose of the things they stand for. "Normandy"... What is this - a place, a sort of champagne, a hotel, or the internal name of some freaking Area 51 document?..
Like, "In order to disable Normandy, uncheck Vaduz, Monterrey, Vologda and select Newcastle in the Xinjiang drop-down menu - we'll ship Bronx with the next update".
Just discovered the same message in the Tor browser, and it seems that NoScript got disabled. So people running Tor are a lot more vulnerable right now.
Also, wow, the web has a ton of ads. I've been running uBlock origin so long I forgot how bad it had gotten :(
Considering that JavaScript has been used in the past to unmask Tor users, this is a frightening security bug and is not "fail-safe" behavior. The extension should remain enabled, but with a warning.
It is doubtful that Mozilla will change this behavior, as they will likely consider it a niche case, but the Tor browser should probably look into alternate means of changing the behavior (patching).
That really sucks that Tor was vulnerable to this too.
Tor needs to fork Firefox "properly" and remove all of Mozilla's bullshit like I've seen some forks do (Waterfox?). I thought this would be common sense for the people at Tor.
The more people who use ad blockers, the more ads websites need to serve to make the same amount of money. It's been brought up that many Twitch streamers don't receive ad revenue from more than half their viewers.
I can only find a source right now for YouTube, but they're out there for Twitch too.
And? If ads were ads and not malware trackers I wouldn't care. Calling these franken-programs ads stretches the word past its breaking point.
No one can stop actual ads - this comment was brought to you by Pepsi, Pepsi for the love of it. See? Everyone had to read the last sentence even if they had adblock on.
To add to the specific example of twitch, their ads are broken and annoying as hell.
The broken:
- they still don't have the volume of ads under control
- the android app regularly freezes during ad display
- sometimes it disrupts and buffers the stream without then displaying the actual ad
And possibly more, I wouldn't know since all these are enough to make me either not watch twitch or block ads. I disable it once every few months to see if it got better though.
The annoying:
- often the same ad every time (when The Grand Tour started again this year, it was the only ad that ever played for me)
- most ads seem to be trailers for TV shows or movies. Most of those spoil half the story
- if you just want to see what some streamer is doing you have to watch an ad first
Twitch Prime was the only reason I still had Amazon Prime when it removed ads officially. Not anymore.
Twitch turbo was great before Twitch Prime and I had it. But now it's 9,99€ per month which I find outrageous, especially because the streamers will see very little of this money anyways, afaik.
> Also, wow, the web has a ton of ads. I've been running uBlock origin so long I forgot how bad it had gotten :(
Try turning it off. I got rid of uBlock after Ars Technica complained about a lot of their users blocking ads years ago, and it honestly isn't that bad. Every once in a while I do back out of a page for maxing out one of my CPU cores, but otherwise nothing bad ever happens. With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and I follow the ad because I am interested and I want to support the website.
The alternative is websites charging insane amounts of money with paywalls (the Wall Street Journal has their "best" price for 12 months at $360 a year). That is horrible because it means only rich people can pay for high-quality news. Ads are one of the most progressive forms of payment: ads shown to rich people are far more valuable than ads shown to poor people, yet everyone gets the same quality of services/news under the ad model regardless of their income or net worth.
The alternative is those websites not using third-party ads with third-party trackers on them. Ad blockers already do not block first-party ads (because they're indistinguishable from image links). If they really just want my eyeballs, they know how they can get them.
But they really want to track me. And I'm not having that. The moment they stop tracking their users through third party ad networks, most adblockers stop blocking (because there's no AI involved and they wouldn't know what to block except images in general).
It's in their hands, really. If they want to show me ads they can do it in a normal and decent manner.
News websites should in fact be the first to adopt this model, because it's exactly the same thing as ads in print media. But they chose to get those disgusting third-party tracking networks involved. And not just one or two.
I don't have to put up with that, but I really don't see why any action would be required on my side to stop blocking those tracking ads.
Just FYI, that's not really true. Ad blockers mostly use the same filter lists, and those do regularly block ads that are just regular images, and even text nodes. https://www.troyhunt.com/ad-blockers-are-part-of-the-problem... is an example, even if that specific one got resolved.
Adblock Plus has the ability to not block ads that conform to a certain standard, but in addition to conforming to the standard, ad publishers need to pay for that. At least that's what they claim.
"Adblock Plus has the ability to not block ads that conform to a certain standard"
Not my standard. Ad blockers should be rebranded as "tracking blockers" so everyone calls them that. Then sites would have to ask you to "disable your tracking blocker", which sounds scary as hell to users, as it should.
Even my example I linked in the comment you answered to is not about an image hosted on a different server. It's not an image at all. And ublock origin even still blocks it today.
It’s the sheer number of trackers online that really pushes me to use lots of security extensions. I really can’t support that kind of malicious behavior.
> With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website.
If ads weren't doubling as tracking beacons and the occasional malicious drive by download, that certainly would be an option.
This has Nothing to do with "Ads". It has to do with malicious scripts and gratuitous webtrash that sucks up resources.
The instant Mozilla turned off my "Noscript", I got one of those phishing popups that pretends to be from Microsuck and totally locks up Firefux.
On a machine with limited memory, cores, or what-have-you, every web programmer's special cute little "animation" will run, gratuitously, and slow your machine down so much that it becomes unusable.
Maybe "advertisers" need to figure out how to write adaptive code that doesn't depend on cutesy little videos that choke older systems to death. (I notice Amazon has done that... you can run it on nearly anything. Which is why... oh, never mind.)
>nothing ever bad happens. With ads: either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website.
>Assuming you mean that half second looking at the ad: Name a better alternative for funding the internet. Paywalls at every website?
Funding the Internet? What you're talking about (ads) is a revenue stream for what amounts to a handful of websites. google.com, amazon.com, ycombinator.com, reddit.com, thefacebook.com, tweeter.com, etc. could all go offline right now and the Internet would still be here.
That doesn't sound right. What about all the other websites with ads, like recipe sites, guitar chords, porn, diy, etc.? or apps on the Google play store with ads?
I run sites that don't have ads. I don't make any money off of them. I still run them. Seems like a lot of people in software development think similarly.
This is the web that I like. Hobbyists and volunteers running low-fi websites for common interests. I'm not against commercial sites like Netflix but don't think every last blog should be monetised.
How do you pay your bills? If running those websites were your full time job, would you still be okay not making any money off of them? Or have you just decided that only people who have other income should have websites?
I don't understand your question; what about them? The websites are just nodes of the Internet. And I don't understand at all why you brought up Google app store apps, so I'll refrain from commenting on that until I better understand your point.
It doesn't feel like a handful of websites. It feels like the dominant experience of the internet for most people. Ads are a source of revenue for many more websites than just a handful. They are also a source of revenue for more than a handful of apps.
The person I am responding to said the following describes something bad:
> either it takes me half a second to tell I'm not interested in an ad, or I actually am interested and i follow the ad because I am interested and I want to support the website
The second half of that sentence is precisely what I'm describing. Do you disagree with my characterization of that sentence?
I assume they included that part in the quote rather than cutting it off earlier because this was part of what they were saying is bad. Do you disagree with me there?