Very few people have stepped inside Google’s data centers, and for good reason: our first priority is the privacy and security of your data, and we go to great lengths to protect it, keeping our sites under close guard. While we’ve shared many of our designs and best practices, and we’ve been publishing our efficiency data since 2008, only a small set of employees have access to the server floor itself.

Today, for the first time, you can see inside our data centers and pay them a virtual visit. On Where the Internet lives, our new site featuring beautiful photographs by Connie Zhou, you’ll get a never-before-seen look at the technology, the people and the places that keep Google running. The site is available in English, Italian, Spanish, French, Finnish, and Dutch.

In addition, you can now explore our Lenoir, NC data center at your own pace in Street View. Walk in the front door, head up the stairs, turn right at the ping-pong table and head down the hall to the data center floor. Or take a stroll around the exterior of the facility to see our energy-efficient cooling infrastructure. You can also watch a video tour to learn more about what you're viewing in Street View and see some of our equipment in action.

Finally, we invited author and WIRED reporter Steven Levy to talk to the architects of our infrastructure and get an unprecedented look at its inner workings. His new story is an exploration of the history and evolution of our infrastructure, with a first-time-ever report from the floor of a Google data center.

Fourteen years ago, back when Google was a student research project, Larry and Sergey powered their new search engine using a few cheap, off-the-shelf servers stacked in creative ways. We’ve grown a bit since then, and we hope you enjoy this glimpse at what we’ve built. In the coming days we’ll share a series of posts on the Google Green Blog that explore some of the photographs in more detail, so stay tuned for more!


Last year we published Google’s 2010 carbon footprint data for the first time on our Google Green site, and today we’re updating the site with information about our 2011 footprint. We’re also thrilled to report that we’re featured in the Carbon Disclosure Project’s 2012 Carbon Disclosure Leadership Index for a second year.

As we grow our services, we’re doing so in a responsible way. The Internet continues to see explosive growth: we’ve found over 30 trillion unique URLs on the web, up from 1 trillion in 2008. Our servers index 20 billion pages a day, receive 100 billion search queries a month, and support 425 million Gmail users—among many other services. Because we’re carbon neutral, we do all that work with a carbon footprint of zero, minimizing our impact on climate change.

We like to be thorough, so when calculating our carbon footprint we go beyond the typical approach. We cover not only employee business travel, but also daily commuting. We track the miles driven by our self-driving cars and Street View vehicles. And when it comes to our data centers, we take a “kitchen sink” approach—that is, we throw just about everything in. We include data center construction and server manufacturing as well as the energy used by Google-built data centers, leased facilities (called “colos”) and other third-party facilities around the world that house Google equipment. 

Our carbon footprint in 2011, before offsetting it, was 1,677,423 metric tons CO2e. As a normal result of continuing to provide more and better services to more users, our energy consumption in 2011 increased in absolute terms, but not in relative terms. In other words, it’s growing less quickly than our business. Our carbon footprint per million dollars of revenue—a measure of carbon intensity commonly used to track corporate sustainability—has decreased by an average of 10% each year since 2009.
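To make the carbon-intensity metric concrete, here is a minimal sketch of the calculation in Python. The footprint figure is the one quoted above; the revenue figure is Google's publicly reported 2011 revenue (roughly $37.9 billion), used here purely for illustration, and the 2009 back-projection simply applies the 10% average annual decline.

```python
def carbon_intensity(footprint_tco2e: float, revenue_musd: float) -> float:
    """Carbon intensity: metric tons CO2e per million dollars of revenue."""
    return footprint_tco2e / revenue_musd

footprint_2011 = 1_677_423    # metric tons CO2e, pre-offset (from the post)
revenue_2011_musd = 37_905    # ~$37.9B reported 2011 revenue, in $ millions

intensity_2011 = carbon_intensity(footprint_2011, revenue_2011_musd)

# A 10% average annual decline since 2009 implies the 2009 intensity was about:
implied_2009 = intensity_2011 / (0.9 ** 2)

print(f"2011: {intensity_2011:.1f} tCO2e/$M; implied 2009: {implied_2009:.1f}")
```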

Our data centers hit a new low this year—in a good way. The average power usage effectiveness (PUE) across our global fleet of data centers has dropped to our lowest (AKA best) yet: 1.13 (with some facilities dipping as low as 1.08 earlier this year). This means we’ve managed to reduce the energy spent on cooling and other overhead to just 13% of what our computing equipment itself uses.
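PUE is simply total facility energy divided by the energy delivered to the computing equipment, so a PUE of 1.13 means 13% of overhead on top of what the servers themselves use. A minimal sketch (the kWh figures are illustrative, not actual Google measurements):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative: 100 kWh delivered to servers, 13 kWh spent on cooling,
# power distribution and other overhead.
it_kwh = 100.0
overhead_kwh = 13.0

print(pue(it_kwh + overhead_kwh, it_kwh))  # 1.13, the fleet-wide average above
```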

Our campus sustainability programs are thriving. Last year, our shuttle program saw a 60% increase in ridership, and Googlers drove our fleet of hybrid and electric vehicles over 220,000 miles. The combination of our employee shuttle system and our electric vehicle infrastructure takes the equivalent of about 3,000 cars off the road every year. And we’re proud to have over 6 million square feet of building space around the world set to achieve LEED certification.

We continue to look for and implement new ways to reduce our impact on the environment while we increase our impact on sustainability, green energy, and clean technologies. We look forward to reporting back next year on our latest numbers, innovations, and activities.


This week, in partnership with the Tate Modern in London, we released an online art experiment called This Exquisite Forest, which lets you collaborate with others to create animations and stories using a web-based drawing tool.

Seven renowned artists from Tate’s collection, including Bill Woodrow, Dryden Goodwin, Julian Opie, Mark Titchner, Miroslaw Balka, Olafur Eliasson and Raqib Shaw, have created short “seed” animations. From these seeds, anyone can add new animations that extend the story or branch it in a new direction. Or you can start a tree of your own with some friends. As more sequences are added, the animations grow into trees, creating a potentially infinite number of possible endings to each animation.

In addition to the website, an interactive installation will open on July 23 in the Level 3 gallery of Tate Modern. Trees seeded by Tate artists—and the contributions from the public—will be on display as large-scale projections. Gallery visitors may also contribute using digital drawing stations.

This Exquisite Forest uses several of Google Chrome’s advanced HTML5 and JavaScript features to produce a unique content creation and exploration experience. For example, the Web Audio API makes it possible for contributors to generate music to accompany their submissions. The project also runs on Google App Engine and Google Cloud Storage.

Please try it out and contribute your own animation to help the forest grow.


Inspiration comes in many forms and can influence you in unexpected ways. I can trace my own interest in programming to Babbage’s Analytical Engine, which fascinated me on my childhood visits to the Science Museum in London. This idea that science and technology can inspire people is one that we hold close to our hearts.

It’s also the thought behind a new exhibition we’re launching today online and at the Science Museum in London. We hope to inspire people around the world by showcasing the magic that the Internet makes possible.

Launching in beta, Web Lab is a set of five physical installations housed in the Science Museum in London. You can interact with them in person at the museum, or online from anywhere in the world.

By opening up the museum experience to the world online, Web Lab doesn’t play by the usual rules—a visitor’s location and museum opening hours no longer matter. Each of the five experiments—Universal Orchestra, Data Tracer, Sketchbots, Teleporter and Lab Tag Explorer—showcases a modern web technology found in Chrome to explore a particular theme in computer science.

For example, the Universal Orchestra experiment uses WebSockets to demonstrate real time collaboration as people from around the world make music together on custom-built robotic instruments housed in the Science Museum. Please join us online or at the Science Museum in London (entry is free), and let us know what you think. True to its name, the year-long exhibition is a working lab, and we’ll continue to tinker with it based on your feedback. Here’s to the next wave of Internet invention!


At Google, we’re obsessed with building energy efficient datacentres that enable cloud computing. Besides helping you be more productive, cloud-based services like Google Apps can reduce energy use, lower carbon emissions and save you money in the process. Last year, we crunched the numbers and found that Gmail is up to 80 times more energy-efficient than running traditional in-house email.  

We’ve sharpened our pencils again to see how Google Apps as a whole - documents, spreadsheets, email and other applications - stacks up against the standard model of locally hosted services. Our results show that a typical organisation can achieve energy savings of about 65-85% by migrating to Google Apps.

Lower energy use results in less carbon pollution and more energy saved for organisations. That’s what happened at the U.S. General Services Administration (GSA), which recently switched its approximately 17,000 users to Google Apps for Government. We found that the GSA was able to reduce server energy consumption by nearly 90% and carbon emissions by 85%. That means the GSA will save an estimated $285,000 annually on energy costs alone, a 93% cost reduction.

How is the cloud so energy efficient? It’s all about reducing energy use for servers and server cooling. Here’s how it works:

A typical organisation has a lot more servers than it needs—for backup, failures and spikes in demand for computing. Cloud-based service providers like Google aggregate demand across thousands of people, substantially increasing how much servers are utilised. And our datacentres use equipment and software specially designed to minimise energy use. The cloud can do the same work much more efficiently than locally hosted servers.
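The utilisation argument can be illustrated with a small simulation: each organisation must provision for its own peak, but peaks rarely coincide, so a shared pool needs far less total capacity than the sum of everyone’s individual peaks. The workload numbers below are invented for illustration.

```python
import random

random.seed(42)

def peak_capacity_needed(loads):
    """Servers must be provisioned for peak demand, not average demand."""
    return max(loads)

# Simulated hourly demand for 20 small organisations: each averages
# 10 units of load but peaks unpredictably.
orgs = [[random.gauss(10, 4) for _ in range(24)] for _ in range(20)]

# Provisioning separately: every organisation buys capacity for its own peak.
separate = sum(peak_capacity_needed(org) for org in orgs)

# Provisioning together: peaks rarely coincide, so the pooled peak is smaller.
pooled_loads = [sum(org[h] for org in orgs) for h in range(24)]
pooled = peak_capacity_needed(pooled_loads)

print(f"separate: {separate:.0f} units, pooled: {pooled:.0f} units")
assert pooled < separate  # aggregation raises utilisation
```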

In fact, according to a study by the Carbon Disclosure Project, by migrating to the cloud, companies with over $1 billion in revenues in the U.S. and Europe could achieve substantial reductions in energy costs and carbon emissions by 2020:

  • US companies could save $12.3 billion and up to 85.7 million metric tonnes of CO2
  • UK companies would save £1.2 billion and more than 9.2 million metric tonnes of CO2
  • French companies could save nearly €700 million and 1.2 million metric tonnes of CO2

We’ve built efficient datacentres around the world, even designing them in ways that make the best use of the natural environment, and we continue working to improve their performance. We think using the super-efficient cloud to deliver services like Google Apps can be part of the solution towards a more energy efficient future.


In May, the European Commission launched a Public Consultation on cloud computing to collect stakeholders’ input on opportunities and barriers to the adoption of cloud computing. Ms. Neelie Kroes, Vice President of the European Commission and European Digital Agenda Commissioner, summarised Europe’s ambition quite well when she declared in a recent speech that “the goal is to make Europe not just cloud-friendly but also cloud-active.”

At Google, we fully support the European Commission’s efforts in this area. Cloud computing is gaining traction in Europe and elsewhere. The cloud saves users money and it creates jobs. According to a recent study from Professor Federico Etro of the University of Venice, cloud computing in the EU will contribute 0.4% of GDP and create a million jobs by 2016. Similarly, in the United States Vivek Kundra, previously the Obama administration’s Chief Information Officer, recently pointed out in the New York Times that U.S. government agencies can gain significant economic benefits by moving their IT services to the cloud. And, as we said in July, the United States has reached out to industry for input. We’re thrilled that governments in Europe and the United States are so enthusiastic about cloud computing.

For these reasons, Google has submitted its contribution to this important debate in Europe. In particular, we have provided our point of view on what we consider key issues, namely:
  • The legislative framework: We suggest proposals to facilitate cloud adoption and to remove the legislative and administrative barriers service providers are facing in Europe, and still preserve consumer values and data protection.
  • Embracing interoperability and data portability: Google has put a lot of effort into tools and solutions aimed at giving users control over their data in the cloud and making data genuinely portable.
  • Public sector clouds: the public sector should lead by example in important fields like security and procurement, at the EU, national and local levels (similar to the “Cloud First” strategy in the United States).
  • Global solutions for global problems: one of the advantages of the cloud is scalability, which needs to be fostered by setting global standards, in particular in the areas of data protection and security.
At the end of the day, the European Commission has a great opportunity to come up with a proposal that modernises the EU legislative framework and especially the EU data protection regime. The cloud offers the possibility to truly leverage the digital single market to the benefit of all Europeans, both users and providers, and we at Google hope our proposals will help the Commission take the right steps going forward.

(Cross-posted from the Official Google Blog)

We’ve worked hard to reduce the amount of energy our services use. In fact, to provide you with Google products for a month—not just search, but Google+, Gmail, YouTube and everything else we have to offer—our servers use less energy per user than a light left on for three hours. And, because we’ve been a carbon-neutral company since 2007, even that small amount of energy is offset completely, so the carbon footprint of your life on Google is zero.

We’ve learned a lot in the process of reducing our environmental impact, so we’ve added a new section called “The Big Picture” to our Google Green site with numbers on our annual energy use and carbon footprint.

We started the process of getting to zero by making sure our operations use as little energy as possible. For the last decade, energy use has been an obsession. We’ve designed and built some of the most efficient servers and data centers in the world—using half the electricity of a typical data center. Our newest facility in Hamina, Finland, opening this weekend, uses a unique seawater cooling system that requires very little electricity.

Whenever possible, we use renewable energy. We have a large solar panel installation at our Mountain View campus, and we’ve purchased the output of two wind farms to power our data centers. For the greenhouse gas emissions we can’t eliminate, we purchase high-quality carbon offsets.

But we’re not stopping there. By investing hundreds of millions of dollars in renewable energy projects and companies, we’re helping to create 1.7 GW of renewable power. That’s the same amount of energy used to power over 350,000 homes, and far more than what our operations consume.

Finally, our products can help people reduce their own carbon footprints. The study (PDF) we released yesterday on Gmail is just one example of how cloud-based services can be much more energy efficient than locally hosted services, helping businesses cut their electricity bills.

Visit our Google Green site to find out more.

(Cross-posted from the Official Google Blog)

Cloud computing is secure, simple, keeps you productive and saves you money. But the cloud can also save energy. A recent report by the Carbon Disclosure Project (CDP) and Verdantix estimates that cloud computing has the potential to reduce global carbon emissions by millions of metric tons. And Jonathan Koomey, a consulting professor at Stanford who has led several studies on data center energy use, has written that for many enterprises, the cloud “is significantly more energy efficient than using in-house data centers.”

Because we’re committed to sustainability, we sharpened our pencils and looked at our own services to see how they stack up against the alternatives.

We compared Gmail to the traditional enterprise email solutions it’s replaced for more than 4 million businesses. The results were clear: switching to Gmail can be almost 80 times more energy efficient than running in-house email. This is because cloud-based services are typically housed in highly efficient data centers that operate at higher server utilization rates and use hardware and software that’s built specifically for the services they provide—conditions that small businesses are rarely able to create on their own.

An illustration of inefficient server utilization by smaller companies compared to efficient utilization in the cloud.

If you’re more of a romantic than a businessperson, think of it this way: It takes more energy to send a message in a bottle than it does to use Gmail for a year, as long as you count (PDF) the energy used to make the bottle and the wine you drank.

We ran a similar calculation for YouTube and the results are even more striking: the servers needed to play one minute of YouTube consume about 0.0002 kWh of energy. To put that in perspective, it takes about eight seconds for the human body to burn off that same amount. You’d have to watch YouTube for three straight days for our servers to consume the amount of energy required to manufacture, package and ship a single DVD.
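That “eight seconds” figure checks out with simple arithmetic, assuming a human power draw of about 2,000 kcal per day (roughly 100 watts; the metabolic figure is our assumption, not part of the study):

```python
KWH_PER_MINUTE_OF_YOUTUBE = 0.0002   # figure quoted above
JOULES_PER_KWH = 3.6e6

# Assumed human power draw: ~2,000 kcal/day, i.e. just under 100 W.
human_watts = 2000 * 4184 / 86400

energy_joules = KWH_PER_MINUTE_OF_YOUTUBE * JOULES_PER_KWH   # 720 J per minute
seconds_to_burn = energy_joules / human_watts

print(f"{seconds_to_burn:.1f} s")   # close to the eight seconds quoted
```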

In calculating these numbers, we included the energy used by all the Google infrastructure supporting Gmail and YouTube. Of course, your own laptop or phone also consumes energy while you’re accessing Google, so it’s important to choose an efficient model.

There’s still a lot to learn about the global impacts of cloud computing, but one thing we can say with certainty: bit for bit, email for email, and video for video, it’s more efficient in the cloud.

UPDATE | 14 June | 17:50: videos of all the presentations at the Data Centre Summit are now available on our website

Data centres are very important to us—they’re critical to the cloud services we deliver. Over the last 12 years, we’ve put a lot of effort into minimising the amount of energy, water and other resources we use—because it makes financial sense, and because it’s good for the environment too. That work means that today, we use half the energy of a typical industry data centre.

Last week, we brought together more than 150 industry professionals in Zürich, Switzerland for our second conference on data centre efficiency. Since our first conference two years ago in the U.S., the industry’s come a long way, with large operators now very focused on energy efficiency.

With “free cooling” we can dramatically reduce energy consumption by using the local environment to cool servers, instead of energy-intensive chillers. In our data centres we use both air cooling and evaporative cooling—and we revealed the details of the seawater cooling system we’ve custom-engineered for our new data centre in Hamina, Finland.

Google is lucky enough to have the resources and experts to continually improve efficiency. But around 70% of the world’s data centres are operated by companies that probably don’t have those resources.

That’s why we shared five simple and low-cost steps that any company, large or small, can use. These include using plastic meat-locker curtains to separate hot and cold air, or welding your own air-conditioning chimney out of cheap sheet metal. These techniques are proven to increase energy efficiency, reduce electricity consumption and shrink a facility’s environmental footprint.

We also announced that we’re now participating in the European Commission’s Code of Conduct for Data Centres, a framework for designing and operating data centres efficiently. It ties in closely with the way we build and run our facilities, and has a robust checklist of efficiency best practices that are well worth trying out.

The main take-away was that there is no magic in data centre efficiency. With the right information and a bit of creativity, anyone can make their computing infrastructure efficient. If you operate a data centre or server room, please visit our website and make use of the techniques we’ve outlined. Videos of all the presentations from the Summit will be available on the site next week.

Next week, Openforum Europe will host a roundtable discussion on openness and portability in the cloud - a topic that features strongly in the ongoing discussions about how to make Europe “cloud-active”. Speakers include:

Professor Guido Scorza - the founder and President of the Institute for the Policies of Innovation. A lawyer and research fellow in the legal aspects of new technology, Scorza is a visiting lecturer at the Universities of Bologna and Rome. He writes extensively and speaks on legal aspects of software and the rights of competitors and consumers.

Brian Fitzpatrick - head of Google's 'Data Liberation Front', a team of engineers who work to make it easy for people to export their data out of Google's services and into the document format of their choice. This allows users to more easily switch between services and providers in the cloud.

Dimitri Tatari - Director General of the Emilia-Romagna Region’s ICT department and a member of the working team involved in OSEPA (Open Source Software Usage by European Public Administrations). Mr Tatari has been involved in the open source software project of the Emilia-Romagna region for the last three years.

The Round Table will be chaired by Graham Taylor, CEO, Openforum Europe and the rapporteur will be Flavia Marzano. As is usual at an OFE roundtable, the Chatham House Rule will apply.

Date: Tuesday 24 May 2011
Time: 18:00-20:30. A light buffet and refreshments will be served.
Where: Google EU office at Chaussée d'Etterbeek 180, right next to Park Leopold. That's just a 5 minute walk from the Parliament and the Commission.
Registration: Please sign up here if you’d like to attend.

We look forward to seeing you there!

Everyone loves a good statistic. And if you’re a policymaker, parliamentary assistant or academic, statistics - vital for evidence-based policy-making - are your bread and butter.

Over the last two years, we’ve made it easier to find, explore and understand more than 27 datasets through search and via clear, colourful visualisations using the Google Public Data Explorer. You can find and analyse over 300 data metrics provided by public institutions such as Eurostat (eg: EU inflation rates), the OECD (labour productivity), the IMF (government debt levels), UNECE (gender balance in parliaments) and many others.

Yesterday, we made the Public Data Explorer even more useful by enabling you to visualise your data. If your organisation produces statistical reports on its key performance indicators, tracks financial or societal trends or conducts large-scale surveys, you can now benefit from the same sort of powerful, animated visualisations that we provide today with our existing datasets.

To make this possible, we’ve developed a new data format that makes visualisation easy, and have provided an interface for anyone to upload their datasets. Once imported, a dataset can be visualised in the Public Data Explorer, embedded in external websites, and shared with others. This does of course require some technical skills, but we’ve built upon existing open standards and a simple user interface to make things as easy as possible.

Our hope is that even more useful statistics can come to life through Public Data Explorer visualisations - and that we can help you realise the value of data in making informed, data-driven decisions.

Game theory in advertising content and pricing; the Panopticon implications of the Internet as our digital memory; and bringing 30-year-old guidelines on privacy into the Internet age - these topics and more are addressed by leading academics in the new series of Oxford Internet Institute lectures, hosted by Google Brussels.

Established in 2001 as an academic centre for the study of the societal implications of the Internet, the Institute devotes its research faculty to the economic, social, political and ethical questions shaping the Internet today.

The first lecture of the autumn series was given by Dr Greg Taylor, an economist whose research focuses on the microeconomics of search and other online marketplaces, and the social science implications of commercial interactions. Greg presented his study of the relationship between the information content of online advertisements and the fee structure used to price them, looking at pay-per-click, pay-per-impression and pay-per-sale models.

On November 9th, we will welcome Christopher Kuner, Head of the International Privacy and Information Management Practice at law firm Hunton & Williams. Christopher will discuss the Regulation of Transborder Data Flows in the framework of the OECD privacy guidelines and will debate whether the policies that form the basis of today’s privacy and data protection laws are in line with the realities of the Internet age. You can sign up for his lecture here.

On December 8th, Graeme B. Dinwoodie, Professor of Intellectual Property and Information Technology Law and Professorial Fellow of St. Peter’s College Oxford, will discuss keyword advertising and trademark law. You can sign up for his lecture here.

The 2011 calendar will kick off on February 8th with a lecture from Christopher Millard, Professor of Privacy and Information Law at the Centre for Commercial Law Studies, and researcher at the QMUL Cloud Legal Project. Christopher will discuss the shift of computing intelligence to the Internet cloud and the key legal and regulatory challenges of controlling and processing data in the cloud. The registration form for his lecture is here.

If you would like to be added to the mailing list for the Oxford Internet Institute lecture series, please contact Tim Davies: tim.davies [at]


It's fair to say that the most popular applications and services that exist today are all to be found in the internet cloud - rather than actually on your computer, as installed applications. Think social networks, email, photosharing, online documents, blogs - and much more of course.

These services are constantly being improved, and new ones appear all the time. Users frequently switch services or try out new ones, perhaps because their friends are using a different service, because there’s better functionality or faster performance elsewhere, or simply because they want better service.

So let’s imagine you’ve been using a particular service for a while, and - for whatever reason - you decide to switch to a different provider. A lot of your data is now stored in the service - your photo collection maybe, your status updates, your contacts, your emails and so on. Which raises the question:

How on earth do I get all of my data out of this service and transfer it to the new one?

At Google, that’s a question we take very seriously. So seriously, in fact, that we have a special team of engineers who spend their time doing nothing else but making sure that it is easy to stop using Google services, and easy to take your data with you, using open standards and formats.

The name of that team is the Data Liberation Front (and yes, for anyone who had spotted the oblique reference, they are Monty Python fans).

On Tuesday 20th of April, Brian Fitzpatrick, the founder of the Data Liberation Front, will be in Brussels to give a Google TechTalk. He’ll explain what "liberating data" actually means, why he thinks it's so important for internet users, for the future of the Internet, and for Europe.

As usual, the TechTalk will take place over lunchtime (there will be food available of course!), at the Google office.

We hope you can make it along. If you’d like to attend, please register here.

When: Tuesday, April 20, 12:15 - 13:45 hours CET (Sandwich lunch provided).
Where: Google Brussels - Chaussée D'Etterbeek 180 - Steenweg op Etterbeek 180, 2nd floor, 1040 Brussels

Brian Fitzpatrick started Google's Chicago engineering office in 2005. An open source contributor for over 10 years, Brian is the engineering manager for several Google products, a member of both the Apache Software Foundation and the Open Web Foundation, a former engineer at Apple and CollabNet, a Subversion developer, a co-author of "Version Control with Subversion", and a resident of Chicago.

Alain Van Gaever
Policy Manager - Google Europe

How should the European Union approach innovation policy over the next five years? That was the fascinating question that I came to Brussels this month to discuss at the Lisbon Council. It’s not an easy question, but it is one that Europe as a whole needs to address, as innovation is vital for competitiveness.

We all talk about innovation, but what is it really? At a high level, it can probably be broken down into three broad categories. First, there's incremental innovation, which improves an existing product or process. In the '70s and '80s that meant moving tools closer to the workers, saving time and improving efficiency. All of which is useful. A more recent example is our own Gmail team, which operates Labs to showcase additional features alongside the core email product, such as Mail Goggles, which lets you check that you're really sure you want to send an email you might later regret.

The second order of innovation is similar to the first, but here the incremental improvement has a distinct side effect. Consider our AdSense programme. We’ve developed a system that matches advertisements to the content of publishers' websites. This makes it possible for people not only to tell their story in their local language but also to get paid for it. Instead of a few big publishers being the only ones to benefit, anyone with an Internet connection can participate, a side effect that has made the Internet an incredible platform for free speech.

Finally, there's the holy grail, what most people think of as real innovation. Consider our data centre infrastructure. In the past, data centres were built with expensive, ultra-reliable hardware. In contrast, most of our data centres run on cheap hardware; in fact, cheaper than the average game console. We’ve designed software that can recover when these machines fail, making the consumer experience just as reliable. We published a paper a while ago comparing our costs with those of guaranteed-reliable hardware, and claimed that our approach delivered roughly a tenfold improvement in cost per Mb versus traditional approaches, a game-changing number.
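The reliability argument is worth making concrete. If each cheap machine fails independently, replicating work across a few of them drives the chance of total failure down geometrically, which is what lets fault-tolerant software turn unreliable hardware into a reliable service. A sketch with invented failure rates (not Google's actual figures):

```python
# Assume each cheap machine is independently "up" only 99% of the time.
p_down = 0.01

def availability(replicas: int) -> float:
    """Service is up unless every replica is down simultaneously."""
    return 1 - p_down ** replicas

for n in (1, 2, 3):
    print(f"{n} replica(s): {availability(n):.6f} availability")
```

With these numbers, three replicas of commodity hardware already give "six nines" of availability, far beyond what any single premium server could offer.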

So what does this mean for policy makers? I shared the platform with Anthony D. Williams, who presented his paper on Wikinomics and the Era of Openness. Much of the subsequent discussion of his emphasis on collaborative innovation focused on the consequences for intellectual property regulation.

Innovation Commissioner Máire Geoghegan-Quinn gave the keynote speech. She outlined an agenda that rightly embraced a broad definition of innovation. In addition to innovation by lab workers in 'white coats', she recognised that real innovation is possible outside the R&D laboratory in all sorts of industries.

For my part, I offered the following ideas for innovation policy:
  • Put the consumer first: We have a core product philosophy within Google that goes like this: “Focus on the user and all else will follow”. It turns out not to be a new idea, as Henry Ford observed when he famously said: “If I asked my customers what they wanted, they’d have asked me to build a faster horse”. The key is understanding user behaviour and building and iterating products by ‘following the data’ on aggregated usage. (The Economist recently provided an excellent introduction to the power of data.)
  • Remove barriers to innovation: Speed matters. Consider Playfish, the company behind the super-popular Facebook app Animal Farm. They started two years ago in the UK and built their entire service on Amazon’s ‘infrastructure as a service’ platform (which enables businesses to rent data centre capacity rather than build it themselves). They recently sold to Electronic Arts for $275m. We sometimes talk of a new ‘innovation without permission’ culture, and facilitating that would be a good goal to set for legislators.
  • Diversity in the hiring process matters: Without multiple perspectives, innovation dies because everyone sees the problem the same way. Europe’s universities must educate a broad cross-section of students in maths, science and computer science. In particular, more women need to be encouraged to take up careers in technology.
  • Risk taking is as important for large companies as for start-ups: We encourage employees to take risks, and we tailor our compensation model to encourage risk taking. Otherwise people quickly learn that to get ahead they shouldn’t try anything new, especially if there is a risk of failure. The result: innovation dies.
We need a broad culture that truly embraces innovation. This may sound obvious, but innovation brings disruption, and disruption quickly mutes enthusiasm. Embracing disruption is perhaps the new Commissioner's real political challenge.

Posted by Rian Liebenberg, Engineering Director

The European Commission today announced an important decision designed to inject more competition into the crucial market for Internet browsers. Under the Commission decision, more than 100 million Europeans will soon receive an opportunity to download a new browser. On both new and old computers that run Internet Explorer, a ballot screen will pop up on their computers displaying icons of the major browser makers and allowing them to choose among them with a simple click.

Browsers are critical to the Internet; they enable us to surf the web, search, chat, email, watch videos, or connect to our social networks. The game changing apps of the past few years -- Facebook, Twitter, YouTube, and others -- have all been built online. Most of the modern computing experience already happens inside the browser and many of the remaining computing tasks inside the PC soon could be done more efficiently and elegantly online in the "cloud."

Most people already spend more time using their browsers than they do in their cars. If unleashed, we believe PC browsers could have an exponential impact on Internet innovation. That's why we launched our own browser, Chrome, and why we are soon launching a full-fledged browser operating system called Chrome OS, which will further increase the speed, simplicity and security of online computing.

In coming months, millions of Europeans will have an opportunity to learn more about the importance of browsers. We plan to continue educating consumers by participating in initiatives such as the site and opening our own dialogue with users, confident that more competition in the browser space will mean a better user experience for people everywhere.

Posted By Sundar Pichai, Vice President Product Management

Cross-posted from Google Policy Blog

Imagine you want to move out of your apartment. When you ask your landlord about the terms of your previous lease, he says that you are free to leave at any time; however, you cannot take all of your things with you - not your photos, your keepsakes, or your clothing. If you're like most people, a restriction like this may cause you to rethink moving altogether. Not only is this a bad situation for you as the tenant, but it's also detrimental to the housing industry as a whole, which no longer has any incentive to build better apartments.

Although this may seem like a strange analogy, this pretty accurately describes the situation my team, Google's Data Liberation Front, is working hard to combat from an engineering perspective. We're a small team of Google Chicago engineers (named after a Monty Python skit about the Judean People's Front) that aims to make it easy for our users to transfer their personal data in and out of Google's services by building simple import and export functions. Our goal is to "liberate" data so that consumers and businesses using Google products always have a choice when it comes to the technology they use.

What does product liberation look like? Said simply, a liberated product is one which has built-in features that make it easy (and free) to remove your data from the product in the event that you'd like to take it elsewhere.

At the heart of this lies our strong commitment to an open web run on open standards. We think open is better than closed -- not because closed is inherently bad, but because when it's easy for users to leave your product, there's a sense of urgency to improve and innovate in order to keep your users. When your users are locked in, there's a strong temptation to be complacent and focus less on making your product better.

Many web services make it difficult to leave -- you have to pay to export your data, or jump through all sorts of technical hoops (for example, exporting your photos one by one rather than all at once). We believe that users - not products - own their data, and should be able to quickly and easily take that data out of any product without a hassle. We'd rather have loyal users who use Google products because they're innovative - not because they're locked in. You can think of this as a long-term strategy to retain loyal users, rather than the short-term strategy of making it hard for people to leave.
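The "all at once" export described above is simple in principle. Here is a minimal sketch (the function name, record layout, and archive format are our own assumptions, not any particular Google product's API): a user's data is bundled into a single archive of open-format files, rather than being fetched item by item.

```python
import io
import json
import zipfile

def export_user_data(records):
    """Bundle all of a user's records into one zip of JSON files."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as archive:
        for name, data in records.items():
            # Open formats (JSON here) keep the data usable outside the product.
            archive.writestr(name + ".json", json.dumps(data, indent=2))
    return buf.getvalue()

# Usage: one call returns everything, ready to import elsewhere.
archive_bytes = export_user_data({
    "photos": [{"id": 1, "title": "holiday"}],
    "contacts": [{"name": "Ada Lovelace"}],
})
```

The essential property is that export is a single, free operation producing data in formats any other service can read.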

We've already liberated over half of all Google products, from our popular blogging platform Blogger, to our email service Gmail, and Google developer tools including App Engine. In the upcoming months, we also plan to liberate Google Sites and Google Docs (batch-export).

Feel free to take a deeper look into product liberation at, a website we're launching today which is dedicated to explaining the Data Liberation Front and the products we've liberated.

If you'd like to contribute suggestions for services that you think need to be liberated, please do so on our Data Liberation Moderator page. We're also on Twitter @dataliberation.


When I recently traveled to Brussels to speak about Google’s vision for computing, I was pleased to learn how many of our ideas are shared at the European Commission. Both of us are betting on "cloud" computing, where we no longer are doing most of our computing on the desktop, but on the net, through webmail, blogs, posting photos and searching for information.

Information Society Commissioner Viviane Reding has recognized this phenomenon - and the fantastic prospects it offers for Europe. In a speech about a year and a half ago, she noted that she believes the "European software industry" will be able "to ride the rising wave of on-line software." Cloud computing, she added, "will place a new emphasis on open and interoperable systems that can be upgraded and joined together in networks" and "see a shift towards open standards and indeed open source software."

Since the Commissioner's encouraging statements, much progress has been made. In the past, the best technology was launched in the workplace. Now, the best technology starts with consumers. Only a year ago, the costs of an internal video service were prohibitive. No longer. One of my teams is currently working on the next generation of video instant messaging, which brings video-conferencing within the reach of any business. Here's a full copy of the presentation I made to European parliamentarians.

The cloud will enable companies to save costs, particularly small businesses. In the past, businesses needed to make big investments in computers and software for accounting systems, customer management systems, email servers, maybe even phone or video conferencing systems. Today, all of those services are available via the network cloud, and you pay only for what you use. Sophisticated computer systems, previously the realm of larger companies, are suddenly available to all.
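The pay-as-you-use economics can be made concrete with a back-of-the-envelope calculation. All figures below are hypothetical, chosen only to illustrate the shape of the comparison, not drawn from any real pricing.

```python
# Hypothetical figures: an up-front purchase versus renting cloud capacity.
upfront_cost = 20000.0        # servers, software licences, installation
hourly_cloud_rate = 0.50      # rented capacity, billed per hour of use
hours_used_per_month = 200    # a small business's actual monthly usage

# With the cloud, cost tracks usage instead of being paid in advance.
monthly_cloud_cost = hourly_cloud_rate * hours_used_per_month
months_to_match_upfront = upfront_cost / monthly_cloud_cost

print(monthly_cloud_cost)         # 100.0 per month
print(months_to_match_upfront)    # 200.0 months before rental spend matches the purchase
```

The point is not the specific numbers but the structure: a small business pays a modest amount each month, scaled to what it actually uses, instead of a large fixed investment up front.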

When we at Google and other web-born companies such as or Amazon built our robust platforms for our own services, we started to rent access to our data centres to other companies. Data centres involve huge fixed cost investments, but we're offering server capacity that you can scale as your business succeeds.

I concluded with the message that this move to online computing promises a radical transformation, one crucial to Europe’s future competitiveness. My audience at the European Parliament seemed to understand, particularly when I mentioned how Barack Obama used the net to propel his presidential campaign forward. European Parliament elections are scheduled for June, and many parliamentarians asked how they too could benefit from moving their campaigns into the cloud.

Posted by Rian Liebenberg, Engineering Director