Making Firefox More Robust
Some of the comments on the Firefox/Feature_Brainstorming page have touched on the fact that Firefox lets pages interfere with each other in bad ways. This page looks at how to improve Firefox's robustness by putting pages from different domains in different processes.
Current Robustness Problems
The brainstorming page covers a lot of these problems, but I'll mention them here for completeness.
- Concurrency - JavaScript execution in Firefox is not interleaved across sites. If a script on one site performs a long computation (either maliciously or because of a bug), the user cannot interact with other pages until either (1) the script finishes, or (2) Firefox pops up a dialog warning that the script is taking a long time. Malicious pages can use simple tricks to avoid this dialog and slow down everything in the browser.
- Memory Management - Sites can allocate as much memory as they want, until the entire browser becomes unresponsive. It can get to the point that the user has to kill Firefox, and all open pages go with it. The brainstorming page lists several variations on this.
- Failure Isolation - If one page crashes, all pages crash with it. This could be the result of a bug in Gecko or in any plugin, such as Java applets, Flash, or PDF viewers.
Ways to fix these problems
The concurrency problem could be solved with multiple threads, but threads share an address space, so this doesn't prevent a crash on one page from taking down all other pages.
Instead, OS processes offer enough isolation to solve these problems, and they correspond to the idea of having different "web applications," just like desktop applications. Each process is scheduled by the OS, has its own address space, and can't interfere with other running processes.
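The isolation argument above can be sketched outside the browser. In the plain Python stand-in below (not Firefox code), `run_page` is a name I made up for launching a simulated page in its own OS process; a crash in one process is invisible to its sibling.

```python
import subprocess
import sys

def run_page(code):
    """Run a simulated 'page' in its own OS process.
    Returns (exit code, captured stdout)."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True,
    )
    return result.returncode, result.stdout.strip()

# One "page" crashes its process; the other is completely unaffected
# and the parent (the "browser") just sees a bad exit code.
crashed = run_page("import os; os._exit(1)")
healthy = run_page("print('still responsive')")
```

Here `crashed` comes back as `(1, '')` and `healthy` as `(0, 'still responsive')`: the OS contains the failure, which is exactly what threads in one address space cannot guarantee.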
There are two challenges to using multiple processes in Firefox. The first is picking the right level of granularity, because some web pages can communicate with each other via the DOM. Putting those pages in separate processes would break this feature.
The second challenge is that Firefox profile data cannot currently be shared between processes. I won't get into that here; just look at bug #135137.
Where to Introduce Processes
We can look at different approaches to decide the best way to introduce multiple processes without breaking existing pages. No current browser gets this right.
- Process per Browser - Some browsers, like Safari and Opera, put all pages in the same process. In this approach, a crash on one page or plugin takes down all pages. Firefox roughly fits in this category, though you can run different Firefox profiles in different processes. (These profiles can't share bookmarks, preferences, etc., so this isn't a great solution.) The diagram below shows this, using a gray box to represent a process. "Child" pages (e.g., new windows or frames opened by a page) are shown below the page, and arrows represent when pages can talk to each other over JavaScript.
- Process per Group of Windows - Internet Explorer and Konqueror start a new process each time you start the program from its icon. However, they use the same process for pages in different tabs, or when the user chooses "New Window." There's no visual indication of which windows belong to which process, and a crash still takes down all pages in a given group of windows.
- Process per Page - Putting each page in its own process would prevent all pages from interfering with each other, but it also breaks any pages that communicate using JavaScript and the DOM. Here, a "page" is any HTML document, whether it is shown in a window, tab, or embedded frame.
- Process per Origin - The browser has a "same-origin" security policy, which prevents pages from different origins from communicating via the DOM or JavaScript. Two origins are the same if they have the same protocol (e.g., http, https), same port, and exact same domain (e.g., maps.google.com and mail.google.com are different). There's an exception to this: browsers also allow pages to talk to pages from suffixes of their own domain, if they modify their "document.domain" variable. (For example, store.company.com can then talk to company.com.) Putting all pages from the same origin in the same process would break pages that use this feature (which includes many prominent sites).
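A minimal model of the same-origin rule and the document.domain relaxation described above may help; the function names here are my own, and real browsers enforce more rules than this sketch shows.

```python
from urllib.parse import urlsplit

DEFAULT_PORTS = {"http": 80, "https": 443}

def origin(url):
    """Return the (scheme, host, port) triple that defines an origin."""
    parts = urlsplit(url)
    port = parts.port or DEFAULT_PORTS.get(parts.scheme)
    return (parts.scheme, parts.hostname, port)

def same_origin(a, b):
    """Two pages share an origin only if all three components match."""
    return origin(a) == origin(b)

def can_talk_via_document_domain(host_a, host_b, shared_suffix):
    """Both pages may set document.domain to a common suffix of
    their hostnames, after which the browser lets them communicate."""
    def has_suffix(host):
        return host == shared_suffix or host.endswith("." + shared_suffix)
    return has_suffix(host_a) and has_suffix(host_b)
```

By this model, maps.google.com and mail.google.com are different origins, yet store.company.com and company.com can still opt in to talking via document.domain — which is exactly the case a strict process-per-origin split would break.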
- Process per Domain - Instead of separating sub-domains like store.company.com and company.com, we can put all pages from *.company.com in the same process. This makes a domain responsible for all the pages it delivers, while preventing those pages from interfering with pages from other domains. This approach doesn't break existing pages, and it could support a management feature like "kill all pages from somebadsite.com".
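Mapping a page to its process under this policy is just a function of its hostname. The sketch below is deliberately simplified: it takes the last two labels, while a real browser would need a public-suffix list to handle registrable domains like example.co.uk.

```python
def process_key(hostname):
    """Map a hostname to its process group by registrable domain.
    Simplified: uses the last two labels; a real implementation
    would consult a public-suffix list."""
    labels = hostname.split(".")
    return ".".join(labels[-2:])

# All *.company.com pages land in one process; unrelated
# domains land in their own, so one can be killed on its own.
assert process_key("store.company.com") == "company.com"
assert process_key("company.com") == "company.com"
```

Under this mapping, maps.google.com and mail.google.com share a process (preserving document.domain communication), while somebadsite.com gets a process that can be killed without touching anything else.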
Exceptions
Note that Mozilla-based browsers support signed scripts, which let a script ask the user for permission to talk to pages from different domains. (In the diagram above, this would add an arrow between D.com and E.com.) However, I haven't found any popular sites using this feature, and I'm not sure it's worth letting pages from different domains interfere with each other. Correct me if I'm wrong.
Other browsers like Opera support HTML 5 message passing, which allows two pages from different domains to pass messages using a "postMessage" function call. This would also add an arrow between D.com and E.com above. However, message passing is much closer to inter-process communication than to shared memory, so it could still be implemented even if the pages are in different processes.
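To see why postMessage survives a process split, here is a toy version using two OS processes and a pipe (a Python stand-in, not how any browser implements it): the string is copied between address spaces, so no shared memory is needed.

```python
import subprocess
import sys

# The receiving "page" runs in its own process; its stand-in for an
# onmessage handler reads one message from stdin and replies on stdout.
RECEIVER = """
import sys
message = sys.stdin.readline().strip()
print("received: " + message)
"""

def post_message(message):
    """Deliver a string to the other page's process, postMessage-style:
    the data is serialized and copied, never shared."""
    result = subprocess.run(
        [sys.executable, "-c", RECEIVER],
        input=message + "\n",
        capture_output=True, text=True,
    )
    return result.stdout.strip()
```

For example, `post_message("hello from D.com")` returns `"received: hello from D.com"` even though sender and receiver never share an address space.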
Performance
Introducing more processes will inevitably add memory overhead, but not as much as one might expect. Shared libraries cut down on the actual amount of memory needed for each extra process. Also, this approach greatly improves the responsiveness of each page when one bad page is doing something expensive.
While many people have requested making Firefox faster on the feature brainstorming page, making it more robust could actually set it apart from other browsers. It's worth taking this into account.
Prototype
I've already started building a prototype of a web browser that uses a process per domain. However, because Firefox profiles cannot be shared across processes, I'm modifying Konqueror instead of Firefox.
To do this, I'm using XParts to embed a KHTMLPart from one process in a Konqueror window in a different process, which is working pretty well so far. I'm currently working on the process management code, to map each window/tab/frame to the correct process, and to decide when a process can be safely killed (i.e., when all pages from a domain leave the browser history).
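The process-management bookkeeping can be sketched as a reference-counting table (hypothetical names, not the actual Konqueror code): a domain's process becomes killable once its last page leaves the browser.

```python
class ProcessManager:
    """Track which pages belong to each domain's process, and retire
    the process when the last page from that domain goes away.
    Hypothetical sketch of the bookkeeping, not real browser code."""

    def __init__(self):
        self.pages_per_domain = {}  # domain -> set of page ids
        self.killed = []            # domains whose process was retired

    def open_page(self, page_id, domain):
        self.pages_per_domain.setdefault(domain, set()).add(page_id)

    def close_page(self, page_id, domain):
        pages = self.pages_per_domain.get(domain, set())
        pages.discard(page_id)
        if not pages:
            # Last page from this domain is gone; its process
            # can now be safely killed.
            self.pages_per_domain.pop(domain, None)
            self.killed.append(domain)
```

So a domain's process outlives any single tab, and is reclaimed only when no window, tab, or frame from that domain remains.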
If we want to fix these robustness problems for Firefox 3, though, we first need to fix bug #135137 (to share profile data across processes).
Summary
We can significantly improve the robustness of Firefox without breaking existing pages, by putting pages from different domains in different processes. This corresponds to the idea that Firefox is running different JavaScript programs from different domains, so it should separate them the way an OS separates applications.
Discussion
Feel free to discuss this below, or on the Talk page for this page.
About Me
Charlie Reis, University of Washington
creis [at] u [dot] washington [dot] edu