October 31, 2011

Memory comes to the fore in Firefox 7

Improved performance and reduced memory use were the goals of Firefox 7, which arrived on schedule today from Mozilla. Firefox 7 is available to download for Windows, Mac, Linux, and Android.

The new Telemetry feature asks you to opt in before it will collect your data.
(Credit: Mozilla)

The wide-release version of Firefox 7 brings to the majority of Firefox users changes that the beta and Aurora channels have been testing for some time. These include claims of significantly reduced memory use, "often 20 percent to 30 percent less, and sometimes as much as 50 percent less," as a company representative wrote in a blog post based on work by Firefox developer Nicholas Nethercote.

These performance gains are the first public results of an internal Mozilla project called MemShrink, which, as the name implies, is about reducing the browser's system impact. Mozilla cited several specific areas of improvement in Firefox 7, including when the browser is kept open for long periods of time, when multiple tabs are open at once, and when the browser is used concurrently with other programs that also use a lot of memory. The company also noted that MemShrink was successful in part because of the rapid-release cycle that a vocal minority of Firefox users have been criticizing.

The spotlight on performance is something that Mozilla clearly plans to keep lit. When you install Firefox 7, you'll be prompted to opt in to a new anonymous-reporting measure that the company is calling Telemetry. Not unlike security suites that use your data anonymously to improve threat detection rates, Mozilla plans to crowd-source its performance data to learn more about how the browser performs in real-world situations.

Unlike the security suites, Telemetry is an opt-in reporting system, so Mozilla won't be collecting data without permission. Lead privacy engineer Sid Stamm addresses security concerns in a blog post, but the short version is that Mozilla is far more open about the data it collects--and why it collects that data--than competitors such as Apple, Google, and Microsoft. Currently, Telemetry looks at four categories: memory usage, CPU core count, cycle collection times, and startup speed. Curious readers can install the about:telemetry add-on to see the personal statistics Firefox is gathering.

If you've enabled Telemetry and would like to disable it, you can go to Options, Advanced, and uncheck the Submit Performance Data box at the bottom of the General tab.
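
For those who prefer editing preferences directly, the same switch can be flipped from a user.js file in the Firefox profile folder. Below is a minimal sketch; the pref name toolkit.telemetry.enabled is my assumption for the setting behind that checkbox, so verify it in about:config before relying on it.

```ts
// user.js in the Firefox profile directory uses plain JS syntax; Firefox
// itself supplies user_pref, so the declare line exists only to let this
// sketch type-check as TypeScript. A real user.js holds only the last line.
declare function user_pref(name: string, value: boolean | number | string): void;

// The pref name is an assumption -- confirm it in about:config on your build.
user_pref("toolkit.telemetry.enabled", false); // opt out of performance reporting
```
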
Other changes in Firefox 7 for desktops include a new version of hardware-accelerated Canvas for faster HTML5 games and animations, and improvements for Web developers. These include support for the W3C navigation timing API, which allows developers to measure page load time and site navigation against factors like bandwidth, and a new set of Firefox tools for developers.
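
To give a feel for what that timing API exposes, here is a minimal TypeScript sketch. It reads the performance.timing object defined by the Navigation Timing spec; the metric labels are my own.

```ts
// performance.timing holds millisecond timestamps for each phase of the
// page load, so simple subtraction yields load-time metrics.
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd has been recorded.
  setTimeout(() => {
    const t = performance.timing;
    console.log("DNS lookup:", t.domainLookupEnd - t.domainLookupStart, "ms");
    console.log("TCP connect:", t.connectEnd - t.connectStart, "ms");
    console.log("Time to first byte:", t.responseStart - t.navigationStart, "ms");
    console.log("Full page load:", t.loadEventEnd - t.navigationStart, "ms");
  }, 0);
});
```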

Firefox 7 for Android includes a laundry list of changes, including the ability to select text in a Web page for copying and pasting. Long-tap on a site, and the Android-style drag handles will appear. There's a new Quit feature under Preferences/More to force an exit from the browser, the WebSocket API now works on Firefox for mobile devices, and image rendering has been improved on Tegra-powered tablets and phones. The browser also now auto-detects your system default language if it's supported, and a new Preferences option lets you change the browser's display language on demand.
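
As a rough illustration of the kind of code that now works on mobile, here is a short WebSocket sketch in TypeScript. The wss://example.com/echo endpoint is a placeholder, not a real service, and Firefox builds of this era actually exposed the constructor under a Moz prefix rather than the unprefixed name shown here.

```ts
// Open a socket to a placeholder echo endpoint and exchange one message.
const socket = new WebSocket("wss://example.com/echo");

socket.addEventListener("open", () => {
  socket.send("hello from Firefox for Android"); // runs once connected
});

socket.addEventListener("message", (event) => {
  console.log("server replied:", event.data);
});

socket.addEventListener("close", () => {
  console.log("connection closed");
});
```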

Facebook: 'Open hardware' integral to green IT infrastructure

Open Compute Project strives to model itself after the Apache Software Foundation, with technical contributions vetted by community members.

Is the secret to building greener, more energy-efficient data centers using “open hardware” optimized for that purpose?

Facebook and its allies in the Open Compute Project certainly would have you believe that this is so. Right now, most server hardware vendors invest in “gratuitous differentiation” and not true innovation, according to Andy Bechtolsheim, chief development officer for Arista Networks and a co-founder of Sun Microsystems, who is one of the Open Compute Project’s five newly named board members.

“What has been missing is standards at the systems level,” Bechtolsheim told attendees of the Open Compute summit held this week in New York.

If you step back and look at what Facebook was able to accomplish at its Prineville, Ore., data center, the evidence certainly suggests that Internet service providers or those building cloud infrastructure would do well to embrace some of this philosophy.

Facebook built its own servers for that facility, adopted a new power distribution design, and approached each rack holistically in order to drive efficiencies. That facility can run workloads using up to 38 percent less energy than its counterparts, at a 24 percent cost reduction, according to the Open Compute Web page. “This isn’t just important for the environment, it is green for the bottom line,” said Frank Frankovsky, director of technical operations for Facebook and chairman of the Open Compute Project.

Let’s be clear: the Open Compute Project isn’t a standards organization. It is modeled after the open source software movement, which Frankovsky and other Open Compute Project members suggest has spurred software innovation that has far outpaced true advances in hardware. Community members are being encouraged to contribute designs and architectural best practices. For example, ASUS has submitted motherboard specifications, while Facebook is opening up its OpenRack specifications.

The other thing to keep in mind is that there is a lot more to a data center than just the information technology housed within.

In his presentation during the Open Compute Summit, James Hamilton, vice president and distinguished engineer with Amazon Web Services, pointed out that there has been more data center innovation in the past five years than in the preceding 15, inspired by the challenges of computing at scale.

The cost of infrastructure directly impacts service costs, so efficiency is paramount. Here are several of the issues Hamilton discussed during his talk:

Virtualization: Intuitively, current best practices suggest that less is more in the data center. But Hamilton suggests that data center managers think twice about turning off a server just to save power. “Any workload that is worth more than the marginal cost of power is worth it,” he said.
Power distribution: Up to 11 percent of the power that heads into a data center is typically lost through conversions and other legacy design issues in the grid. Any conversion that can be eliminated along the way should be eliminated; see the sketch after this list. UPS technology, he suggests, is in for an overhaul. (Facebook, as an example, has redesigned the way it includes backup power in its racks.)
Temperatures must rise: Even though most managers run their data centers at 77 degrees Fahrenheit today, most systems can tolerate much higher temperatures. If everyone raises the temperature and talks about the results, temperatures will rise across the sector.
Use outside air 100 percent of the time for cooling, period.
Look to water cooling technologies, including evaporative cooling methods.
Think modular: Regardless of the specific servers they use, the most efficient data centers in the world all use modular architectures that use some sort of outside cooling mechanism. That includes Microsoft, Facebook and Amazon.
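
To make the power-distribution point concrete, here is a toy TypeScript calculation. The stage efficiencies are illustrative numbers I chose, not Hamilton's figures; the point is only that losses compound across every conversion in the chain.

```ts
// Toy model of a data center power chain: each conversion stage passes on
// only a fraction of its input, so losses compound multiplicatively.
// Efficiencies are illustrative, not measured figures.
const stages = [
  { name: "utility transformer", efficiency: 0.98 },
  { name: "double-conversion UPS", efficiency: 0.94 },
  { name: "PDU transformer", efficiency: 0.98 },
  { name: "server power supply", efficiency: 0.90 },
];

const delivered = stages.reduce((frac, s) => frac * s.efficiency, 1.0);

console.log(`Reaches the servers: ${(delivered * 100).toFixed(1)}%`);       // ~81.2%
console.log(`Lost along the way:  ${((1 - delivered) * 100).toFixed(1)}%`); // ~18.8%
// Removing any one stage recovers its loss directly, which is why every
// conversion that can be eliminated should be.
```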

The cynic in me believes that the Open Compute movement has a long uphill battle to fight in existing data center environments. Yet, as the industry moves to scale computing architectures that can support cloud-delivered infrastructure services and applications, it is clear that the hardware world may be holding things back. That, in itself, is a reason to keep a close eye on Open Compute Project development.

Why Microsoft's vision of the future will really happen

Two videos from Microsoft show the future of technology. Here's why I think they're dead-on

Computerworld - Microsoft released a video in 2008 and another one this week that together predict the sleek, wireless, connected gadgets we'll all enjoy by the year 2019.


In one scene, two businesspeople each place a smart object on a smart table -- a keychain fob and a flat phone or smartcard of some kind. From these devices, out spills their data, which can be manipulated on the table. The same thing happens at home, where a girl's homework spills out onto the kitchen table, and cookbook instructions spill out onto the kitchen counter.

Data and documents can apparently be transferred from anything to anything else. One business-related example involves a drag-and-drop gesture from a desktop to a mobile device. In another scene, that same mobile device becomes a virtual keyboard for a desktop computer the user happens to be sitting at.

Another example shows a man "capturing" data from a wall-mounted device with a kind of take-a-picture gesture using a clear-glass remote control, then dumping it out onto his e-newspaper.

Videoconferencing has been perfected. What looks like a glass window into another classroom is actually a live, big-screen video chat connecting schools in India and Australia. In one scene, two children interact with each other, each speaking a different language, with instant translation shown in cartoon-like speech bubbles.

Intelligent agents pay attention to what's going on. The kids fingerpaint a dog onscreen, and the computer recognizes the image and animates it accordingly.

One very cool and versatile device shown in the video is a smartphone, a card-like gadget so thin that a woman uses it as a bookmark. The card functions as a boarding pass, an airport map, a calendar, an augmented reality window, a 3D holographic display and more.

The phone splits into two halves about the size of playing cards, with one "card" displaying live video and the other held up to the ear for videoconferencing on the go. It even projects some kind of laser beam arrow on the ground, telling Mr. Future Businessman where to go.

Everything is connected to everything. Intelligent agents make decisions about when to inform the user about relevant data.
Why these are great predictions

Everything in this video is being worked on, refined and developed. If you follow current trends for compute power, display technology, networking speeds, device miniaturization, flexible displays, touchscreens, gesture technologies and others, you get this Microsoft future.

And Microsoft itself is working on much of this. The intelligent displays are really just advanced versions of what's possible now with a Microsoft Surface table. The in-air gestures are advanced versions of what Kinect for Xbox 360 users are already doing.

Industrywide, displays are getting bigger while devices are getting thinner and lighter. Companies have already developed versions of clear displays, augmented reality systems and all the rest.

The past four years have ushered in thin multitouch tablets supporting gestures and intelligent agent voice technology.

Although breathtaking to look at and consider, everything in Microsoft's videos is a fairly conservative prediction based on existing products or technology actively being developed.
Why Microsoft won't build it

There tends to be little connection between companies that envision the future clearly and those who build it.