
FBI director calls tech giants’ stance on strong encryption ‘depressing’


FBI Director James Comey told an audience he thinks the government should have a back door to gain access to secure devices. (Holly LaFon/MEDILL NSJI)


WASHINGTON — FBI Director James Comey on Wednesday criticized tech giants including Apple and Google for opposing so-called “back doors” in security software for government agencies to access encrypted phones, computers, and other devices.

The tech companies along with academic experts and advocacy groups wrote a letter to President Obama on Tuesday opposing statements by administration officials who have come out strongly against more robust encryption on consumer products. In fact, some officials have advocated that tech companies stop selling encrypted products altogether unless the government has a way to decrypt the data.

The letter makes the case that weakening products’ security would only make them more vulnerable to “innumerable criminal and national security threats.”

But Mr. Comey, addressing the Cybersecurity Law Institute at Georgetown University, said the FBI faces increasing difficulty in unlocking encrypted devices – and those who signed the letter were either not being fair-minded or were failing to see the societal costs to universal strong encryption.

“Either one of those things is depressing to me,” he said.

Citizens’ privacy interests and public safety are coming closer to “a full-on collision,” he said. Acknowledging “tremendous societal benefits” to encryption, Comey said the inability of law enforcement officials to gain access to encrypted devices when they have probable cause and strong oversight threatens public safety.

“As all of our lives become digital, the logic of encryption is all of our lives will be covered by strong encryption,” he said. “Therefore all of our lives … including the lives of criminals and terrorists and spies will be in a place that is utterly unavailable to court-ordered process. And that to a democracy should be utterly concerning.”

However, tech companies and encryption advocates argue in the letter that creating back doors would also pose an economic threat to the companies, especially in light of the Edward Snowden leaks.

“US companies are already struggling to maintain international trust in the wake of revelations about the National Security Agency’s surveillance programs. Introducing mandatory vulnerabilities into American products would further push many customers – be they domestic or international, individual or institutional – to turn away from those compromised products and services,” the letter said.

What’s more, critics – including many lawmakers – who oppose efforts to weaken encryption say that creating a system in which government agencies have access to secure data would also create vulnerabilities exploitable by criminal hackers and other governments.

Comey acknowledged the business pressures and competitive issues involved, but urged tech companies to find a safe way to cooperate with government needs to access information.

“Smart people, reasonable people will disagree mightily, technical people will say it’s too hard,” he said. “My reaction to that is, ‘Really? Too hard? Too hard for the people that we have in this country to figure something out?’ I’m not that pessimistic.”

Published in conjunction with Arkansas Democrat-Gazette

Minimizing your digital trail

WASHINGTON — In popular culture, going “off the grid” is generally portrayed as either unsustainable or isolated: a protagonist angers some omniscient corporate or government agency and has to hole up in a remote cabin in the woods until he can clear his name, or an anti-government extremist sets up camp, also in the middle of nowhere, living off the land, utterly cut off from society at large.

But is there a way to live normally while also living less visibly on the grid? What steps can you take to reduce your digital footprint that don’t overly restrict your movements?

What is a digital footprint?

Your digital footprint is the data you leave behind when you use a digital service—browse the web, swipe a rewards card, post on social media. It usually falls into one of two classifications: active or passive.

Your active digital footprint is any information you willingly give out about yourself, from the posts you put up on Facebook to the location information you give to your local mass transit system when you swipe your transit pass.

By contrast, your passive digital footprint is information that’s being collected about you without your express knowledge or authorization, for example, the “cookies” and “hits” saved when you visit a website. When you see personalized ads on Google, for example, those are tailored to you through collection of your personal preferences as inferred through collection of your passive digital footprint.

To assess my digital footprint, I looked through my wallet, my computer and my phone.

The footprint in your wallet

First, the wallet: I have several rewards cards, each representing a company that has a record of me in its database that shows how often I shop and what I buy, which is linked to my name, address, email and birthday—plus a security question in case I forget my password, usually my mother’s middle name.

While I would consider this information fairly benign—they don’t have my credit card information or my Social Security number—these companies can still make many inferences about me from my purchases. CVS, for example, could probably say fairly accurately whether I’m sick based on my purchases of medications, whether I’m sexually active based on birth control purchases and what medical conditions I may have based on my prescription purchases.

If I wanted to minimize my digital footprint, I could terminate all my rewards accounts and refrain from opening any more. For me, though, it’s worth allowing these companies to collect my information in order to receive the deals, coupons and specials afforded me as a rewards member.

Next up is my transit pass, which is linked to my name, local address and debit card. The transit authority has a record of every time I swipe my way onto a city bus or train, a record of my movements linked to my name.

A minimal-footprint alternative to a transit pass is single-use fare cards. If purchased with cash, they would leave no record of my travels linked to my name. While this, like the rewards cards, is feasible, it’s far less convenient than the pass—so much less so that again I’m willing to compromise my privacy.

My debit card and insurance card are the two highest-value sources of personal information, but both are utterly necessary—living half a country away from my local credit union, I need my debit card to complete necessary transactions. My medical insurance card, relatively useless to identity thieves unless they have an ID with my name on it, does represent another large file in a database with my personal information—doctors’ visits, prescriptions and hospital stays for the past several years. People with just the physical card, not my license or information, can’t do much with that, but if a hacker gets to that information it could be very damaging.

No driver’s license? No credit card?

To minimize my digital footprint, then, I could pare down my wallet to just the absolute necessities—my insurance card, debit card and my license.

Computer footprint

If I’m guilty of leaving a large digital footprint, all my worst infractions probably happen across the Web.

Between Facebook, Twitter and Pinterest, I’ve broadcast my name, picture, email, hometown and general movements, if not my specific location, on each of those sites. Of the three, Facebook certainly has the most comprehensive picture of my life for the past seven years—where I’ve been, with whom, what I like and what I’m thinking.

If I wanted to take myself as far off the grid as feasible, simply deactivating the accounts wouldn’t work—Facebook keeps all your information there for you to pick up where you left off. You can permanently delete it with no option for recovery, but some information isn’t stored just on your account—messages exchanged with friends, for example, or any information shared with third-party apps.

If you keep using social networking sites, remember that privacy policies change frequently; even if you choose the most restrictive privacy settings, you often have to go back and re-set them whenever the company changes its policy. Apps complicate things even further, farming out much of your information to third-party companies with different privacy policies.

Even if you’re vigilant about your privacy settings and eschew apps, your profile is only as private as your most public Facebook friend, said Paul Rosenzweig, a privacy and homeland security expert.

When shopping online, it’s important to check the privacy statements and security policies of the companies you’re using. If possible, purchase gift cards to the specific retailer or from credit card companies and use those to shop, so you don’t leave your credit card information vulnerable to breaches like that of Target.

I know that email is not my friend when it comes to online privacy, but I can’t operate without it. I use Gmail on Google Chrome, so I installed Mymail-Crypt, one of several programs based on PGP, or Pretty Good Privacy, encryption. Using it, my messages appear to be a jumbled bunch of letters until the recipient decrypts them with a private key. I can upload my public key to a key server, like the aptly named Keyserver, where it’s searchable by my email address or key ID, and then link to it on my personal profiles such as Facebook or LinkedIn. People can then send me email encrypted with my public key, which cannot be read without my private key to unlock it. I’ve also started encrypting my G-Chats using Off the Record chat.
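The article doesn’t spell out the math behind PGP, but the core public-key idea it relies on—anyone can encrypt with a published public key, while only the matching private key can decrypt—can be sketched with textbook RSA on toy numbers. This is an illustration only, not how PGP is actually implemented: real keys use primes hundreds of digits long and layer fast symmetric encryption on top.

```python
# Textbook RSA with tiny primes, purely to illustrate the public/private key
# idea behind PGP. Never use numbers this small for real encryption.
p, q = 61, 53
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)
e = 17                          # public exponent; (e, n) is the public key
d = pow(e, -1, phi)             # private exponent (modular inverse, Python 3.8+)

message = 65                    # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone can compute this with the public key
decrypted = pow(ciphertext, d, n)  # only the private key holder can undo it
print(decrypted)  # -> 65
```

The round trip works because raising to the power e and then d, modulo n, returns the original value; knowing d requires factoring n, which is what makes large keys secure.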

Email can be used against you. Phishers have started to send more sophisticated emails imitating individuals or companies you trust in order to convince you to give up information like your Social Security number or credit card data. Drew Mitnick, a junior policy counselor at the digital rights advocacy group Access Now, said you need to be vigilant no matter what you’re doing on the internet.

“Ensure that whoever you’re dealing with is asking for appropriate information within the scope of the service,” he said. In other words, Gap shouldn’t be asking for your Social Security number.

To limit cookies and other data collection during your Internet use, you can open incognito windows in Google Chrome. In incognito mode, the pages you view don’t stay in your browser or search histories or your cookie store—though your Internet service provider and the sites you visit still have a record of your browsing.

Finally, encrypt your hard drive. Privacy laws vary from state to state and country to country, so the best way to ensure that you’re protected no matter where you are is to encrypt your computer and be careful not to leave it where someone can tamper with it, said Mitnick.

Phone footprint

Another source of vulnerability for many people is a smartphone. As long as you have a phone, you’re on the grid—phone companies can triangulate your position using cell phone towers and location services, and they log your calls. Beyond that, though, there are steps you can take to limit information people can access about you using your phone.

First, be judicious when installing apps. Carefully read the permissions an app requires for installation, and if you’re uncomfortable with them, don’t install it! Read privacy policies and terms of use so you know what data the app keeps on you.

Because I have a Windows phone, many of the basic apps (alarms, maps, Internet Explorer, music and Microsoft Office) are Microsoft apps covered by its terms of use and privacy policy, which is pretty good about not sharing my information with third parties. Microsoft also deletes your account data after you delete its app, though it may take a few weeks.

I have several social apps, such as the aforementioned Facebook and Pinterest, for which the privacy settings are fairly similar to their desktop counterparts—not very private—with the added bonus of them now having access to my location and phone number. It’s entirely possible—and advisable, if you’re trying to leave a minimal footprint—to live without these apps, but I choose not to.

I’m selective about the apps I install on my phone. Aside from the apps that come with the phone and my social media apps, I only have Uber—and that has a lot of access to my phone. According to the app information, Uber can access my contacts, phone identity, location, maps, microphone, data services, phone dialer, speech and web browser. That’s a lot, and not all of it seems necessary—why does Uber need my contacts? Again, though, I chose to compromise my privacy on this one because the convenience, for me, outweighed the risk.

A precaution I’ve always taken is turning off my location service unless I need it. While my cell phone company can still track me, this prevents my apps from accessing my location. I don’t need Pinterest or Facebook to know where I am to get what I want out of the app, so I don’t provide that information to them.

One issue Access Now has been working on is “super cookies”: when you browse on your cell phone, carriers can attach unique identifiers to your traffic as you move across multiple sites. Many companies don’t even offer opt-outs. AT&T has now stopped using super cookies, but other companies still do.

If you don’t already, use two-step verification whenever possible to ensure that no one but you is logging onto your accounts. This process, used by Gmail, has you enter your password and a one-time numerical code texted to a phone number you provide.
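The article doesn’t say how Gmail generates these codes, but a common scheme behind one-time verification codes is HOTP (RFC 4226), which derives a short numeric code from a shared secret and a counter; the time-based variant, TOTP, simply uses the current time as the counter. A minimal sketch:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time code from a shared secret and a counter (RFC 4226)."""
    # HMAC-SHA1 over the big-endian 8-byte counter value
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226's published test secret; a real service shares a random secret
# with your device at setup time.
print(hotp(b"12345678901234567890", 0))  # -> 755224
```

Because both sides hold the same secret, the server can compute the same code you were sent and check that they match; an attacker who knows only your password still can’t produce it.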

Set a passcode to your phone if you haven’t already, and make it something people couldn’t easily guess—don’t use your birthday, for example. I’ve started using random numbers and passwords generated for long-defunct accounts like my middle school computer login that I memorized years ago but that can’t be linked back to me.

Amie Stepanovich of Access Now suggested using four unrelated words strung together for online account passwords—they’re even harder to hack than the usual suggestions of capital and lowercase letters, symbols and numbers.
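A word-based passphrase like the one Stepanovich describes can be generated with a cryptographically secure random source. The word list below is a tiny stand-in for illustration; a real list, such as the EFF’s diceware list, has thousands of entries, which is what gives the passphrase its strength.

```python
import secrets

# Hypothetical mini word list for illustration only; use a large published
# list (e.g. the EFF's ~7,776-word diceware list) in practice.
WORDS = ["correct", "horse", "battery", "staple",
         "orbit", "velvet", "canyon", "mosaic"]

def passphrase(n_words: int = 4) -> str:
    # secrets.choice draws from the OS's cryptographically secure RNG,
    # unlike random.choice, which is predictable.
    return " ".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())
```

Each additional word multiplies the number of possible passphrases by the list size, so four words from a 7,776-word list yield about 7,776^4 (roughly 3.7 × 10^15) combinations.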

One final precaution you can take is to encrypt your device. Apple has already started encrypting its phones by default, and Google has promised to do so. Regardless, you can turn on encryption yourself. I have a Windows phone, which does not allow for easy encryption—in fact, I can’t encrypt my SD card at all. To encrypt my phone, I need to log in to Office 365 on my laptop and change my mobile device mailbox policies to require a password, encryption, and an automatic wipe after a number of passcode fails I choose. I then log into Office 365 on my phone to sync the new settings. It’s much more straightforward for an Android—just go to settings, security, and choose “Encrypt phone.”

Off the grid? Not even close

For me, and for most people, it’s not feasible to live entirely off the grid. Between my debit card, various online accounts and smartphone, I pour my personal data into company and government databases every day. The trick is to live on the grid intelligently, providing only the information that is necessary and taking steps to protect your devices from unauthorized access.

As Europe’s privacy laws evolve, so must American companies when operating ‘across the pond’

WASHINGTON — In 1998, a Spanish newspaper announced that a man named Mario Costeja González had his home repossessed.

A decade later, González Googled his name and found that the incident came up in search engine results. Incensed, he complained to Google, asking that information related to him be erased because he thought it was no longer relevant.

Google refused and the dispute ended up in court. In 2014, the Court of Justice of the European Union ruled in favor of González.

The ruling may seem like an affront to free speech, but the court’s decision reflected the region’s long-running commitment to privacy protection.

With the global nature of Internet commerce, Google will not be the only American company ensnared by European data protection laws. Many other firms may find themselves – sometimes unwittingly – running afoul of European privacy laws, and they are spending more money and effort to cope with this digital clash of cultures.

More than an ocean apart

Citizens in the U.S. and Europe value privacy. But they articulate it differently in legal terms.

Every European citizen has the “right to respect for his private and family life, his home and his correspondence,” according to the 1953 European Convention on Human Rights – and the most significant legislation by the European Union in recent years is a 1995 directive, which outlines core principles its members should observe.

The directive says that governments, institutions and companies should inform citizens of what information is being collected, ensure data is not disclosed to other parties without the individuals’ consent and allow them to access and correct data about them.

The directive has formed the backbone of many European countries’ national privacy laws protecting citizens against intrusions by government and by companies, said Viktor Mayer-Schönberger, an Internet governance and regulation professor at England’s University of Oxford.

One component of the European Union directive states that personal data can be processed only with unambiguous consent given by the subject, among other requirements.

The EU’s Court of Justice ruled in favor of González last year for precisely this reason: Since individuals must give permission for the search engine to handle their data, the companies have to handle requests that their information be taken down.

Privacy law is articulated very differently on the U.S. side of the Atlantic. It is not explicitly guaranteed in the Constitution and only suggested by the Fourth Amendment’s requirement for a warrant for the government to search a citizen’s home.

“What the U.S. lacks is an omnibus privacy law that binds not just the public sector but the private sector as well,” said Mayer-Schönberger. “But the U.S. does have a number of sectoral privacy laws that also apply to the private sector, such as in the context of health data.”

In other words, “privacy in relation to private companies is seen as a species of commercial regulation,” said Bill McGeveran, an information law professor at the University of Minnesota.

The implication of this is enormous for companies wishing to collect and process information about their consumers.

“In Europe, you can only do so if the law says you specifically can, but in the U.S. you can collect data about anyone, anytime, unless there’s a law that prohibits it,” said McGeveran.

“Data is a resource in Europe and the U.S. but in Europe, it’s something in the ground and you need to ask permission before you can mine it.”

Why Europe and America differ on privacy issues

People in the U.S. want privacy just as much as people in Europe, Mayer-Schönberger emphasized. But there is no single easy answer for why data protection legislation is more clearly laid out in Europe.

Europe’s tangled history with data privacy could be a reason – the Nazis used personal data to target marginalized communities during the Holocaust, and in the 1980s, privacy advocates in Germany protested against a census in West Germany that asked questions they deemed too invasive.

“As Germany has always been a key power broker in the EU, that spilled over into the European debate,” he said.

Fred Cate, a law professor at Indiana University, said the U.S. economy’s reliance on technology is also an important reason.

“The U.S. is huge on data innovation – privacy is important, but so is economic success,” he said. “There isn’t a European search engine that can compete with Bing and Google, and so fewer European companies are using privacy as a competitive tool.”

What this means for American companies

For American companies, complying with European privacy laws is a complex process because the level of enforcement varies from country to country. McGeveran said that while privacy regulators in England and Ireland tend to be more cooperative, Spain and Germany are tougher on firms, slapping violators with fines.

Firms may have to jump through additional legal hurdles to do something like moving internal company data, such as payroll information, out of Europe to the U.S.

The clashing regulations could put companies in a legal quandary.

Cate cited the example of a company that was required by a U.S. court to produce certain data that the German government prohibited it from obtaining. “You’re stuck between a rock and a hard place,” he said. “Whose law do you choose to violate?”

American companies therefore must plan carefully when expanding into Europe, especially given the ever-changing Internet landscape and the privacy concerns it has raised.

“They cannot assume that their structure and business model in the U.S. can be duplicated in Europe without any modifications,” McGeveran warned.

The 2014 European decision about Google highlighted this challenge starkly. Search engines have scrambled to cope with the new development in European privacy law.

Google and Yahoo have set up online forms for users to submit removal requests.

To date, Google has approved 286,814 – or 40 percent – of the removal requests it has received, after judging whether the results were outdated, inaccurate, inadequate, excessive or of interest to the public.

Yahoo has set up a similar intake form as well as a task force to figure out how to process the removal requests, said Laura Juanes Micas, a senior legal director of international privacy at the search engine company.

“This situation was about a particular case in Spain and it has been challenging to create general rules for all removal requests from this one case,” she said.

She added that the ruling placed the burden on search engines to balance an individual’s right to privacy against a third party’s right to express himself freely on the Internet – a difficult task for private companies whose main duty is to make a profit and serve their customers.

American search engines are not erasing search results in non-EU web domains for now, meaning that the information would still be viewable in the U.S. version of Google, for instance. However, European regulators are pushing them to apply the ruling to all web domains, said Lucio Scudiero, a privacy legal counsel in Italy and fellow at the nonprofit think tank Italian Institute of Privacy.

“I expect this issue to end up in courts on both sides of the pond soon,” he warned.

Supreme Court decision leaves unanswered questions on GPS tracking

(Mike Renlund/Flickr)

WASHINGTON — As Antoine Jones drove his Jeep Grand Cherokee around the Washington area in the fall of 2005, he was simply going about his daily routine. But unfortunately for Jones, whose daily routine involved frequenting a drug stash house in Maryland filled with $850,000 and 97 kilograms of cocaine, the U.S. government was watching.

Thanks to a global positioning system covertly placed in the underbelly of Jones’ car, the government was able to track and record the Jeep’s every move. But Jones challenged the legality of the evidence, saying the GPS had not been installed within the time frame or physical jurisdiction outlined by the court in issuing a search warrant. The government argued that the GPS placement didn’t actually constitute a search under the Fourth Amendment, so the fact that police had not followed the warrant guidelines was irrelevant.

In what many viewed as a strong victory for privacy rights, the Supreme Court unanimously ruled that the attachment of the device was a search under the Fourth Amendment, thus requiring a warrant. But while the opinion authored by Justice Antonin Scalia answered the specific question regarding a “physical search,” it was mum on the broader implications of the ruling.

“[The case] simply left for another day whether monitoring a device that had been preinstalled or otherwise gathering a large quantum of data on somebody would also raise a Fourth Amendment issue,” said David Gray, an associate law professor at the University of Maryland’s Carey School of Law.  “That was the ground that the four-justice concurring opinion by Justice [Samuel] Alito was ready to reach, but the narrower ground identified by the Scalia majority didn’t need to get there, so it didn’t.”

This narrow ruling was not unusual, Gray explained.  Courts usually try to “reach the narrowest grounds for a decision” and, because the court did not believe that the larger issue was adequately presented, Gray believes it would have been “irresponsible” to extend the decision more broadly.

A whole new level of technology

The Jones decision was built off of the Fourth Amendment, which protects “the right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures… [unless] upon probable cause, supported by oath or affirmation.”  Over time, the amendment has been understood to assert the necessity of a search warrant before law enforcement can begin a search of people or property.

While the U.S. government did concede that officers had violated the terms of the warrant, the lawyers argued that GPS tracking did not require a warrant, citing previous cases that ruled placing a homing beacon on a car did not require a warrant.  However, the defense asserted that GPS technology was exponentially more intrusive than the homing beacons, which essentially allowed police to track the beacon only when they were within its line of sight.

“This is an exceptional form of technology in terms of what resources have been available to law enforcement in the past,” said Kendall Burman, a senior national security fellow at the Center for Democracy and Technology.  “They are able to track individuals and cars in this instant without the use of human beings.”

The third party doctrine

Because Scalia’s ruling stated that the “government physically occupied private property,” questions continue to arise regarding “nonintrusive” searches.

The Supreme Court’s third party doctrine, outlined in United States v. Miller, holds that citizens cannot expect privacy protection under the Fourth Amendment for information they disclose to a third party. When coupled with the growing amount of location information collected by private companies, this doctrine leaves much of that data available to the government without a warrant.

Graphic by Ben Kamisar

John Villasenor, a senior fellow in the Center for Technology Innovation at the Brookings Institution, said that as private companies continue to amass mountains of information on the general public, the kind of location tracking that would require a warrant as a “physical search” under U.S. v. Jones is already becoming less relevant.

“Technology has changed so much that a lot of us have our locations tracked anyway without a warrant, so the issue of before-the-fact warrants will, in many cases, be less important than it was even when the events that led to Jones started,” he said.  “…The location data to track you and me and almost everyone else is already stored somewhere.  The question is, [who can] go and get it.”

As of March, a Pew Internet report found that 46 percent of American adults use a smart phone.  These devices, which mostly run on operating systems created by Apple or Google, collect location data which is aggregated and stored by the company.

Justice Sonia Sotomayor addressed the issue of the third party doctrine in her concurrence, where she mentioned the possibility of reviewing the doctrine.   Burman said that she was “heartened” to see Sotomayor question this doctrine and hopes that the court will address situations where people are not intending to lift the “veil of privacy” from their activities.

“I think the concurrence really draws that doctrine into question,” she said.  “The strength of [Justice] Sotomayor’s concurrence along with [Justice] Alito’s suggests that there is a real opportunity to reevaluate what the third party doctrine means.”

But while Gray understands the need to re-evaluate the doctrine, he believes that the current doctrine “reflects a pre-existing assessment [of] the proper balancing of interests under the Fourth Amendment” between private rights and the ability of law enforcement to perform their duties. In his view, there are many legitimate circumstances in which law enforcement should be able to work with private companies. As a hypothetical example, he cited a social network company turning over evidence of criminal activity to the police of its own accord.

“If you had a broad rule that any information that was detected and aggregated by a private company could not be shared with government without violating the Fourth Amendment, then you would essentially be building this artificial wall that would dramatically limit the ability for law enforcement to get involved in circumstances we would like them to get involved,” he said. “It’s going to be hard to make the case that building an artificial wall best serves the proper balance.”

Tracked by cyber-footprints

Every time we swipe in at the train station, make a purchase at the grocery store or walk by a bank, some element of our personal data is collected. You could be alone on a backpacking trip, making no contact with the outside world, and the smartphone in your pocket could still use global positioning systems to log your every move.

We are living in a time of information overload, and sometimes it becomes difficult to remember where we have left cyber-footprints.

For 48 hours, I tracked exactly who was taking my information digitally and how it could be used. During this time, I spent one day in Washington  and the next day traveling by air to Chicago.


Google standing by hotly contested change in privacy policy

WASHINGTON — Google is maintaining that a privacy policy implemented Thursday is not the dangerous change civil liberties experts are claiming it could become.

The new approach combines the privacy policies of more than 60 Google products into a uniform code that emphasizes what the search giant considers a “more intuitive user experience.”

In an official Google blog post Thursday, Alma Whitten, the company’s director of privacy, product and engineering, wrote that the policy adjustment makes Google’s privacy controls easier to understand. Beyond that, nothing has been drastically modified, she said in the blog post.

“The new policy doesn’t change any existing privacy settings or how any personal information is shared outside of Google,” Whitten wrote. “We aren’t collecting any new or additional information about users. We won’t be selling your personal data. And we will continue to employ industry-leading security to keep your information safe.”

The company has contended a more universal policy will work to its users’ advantage in the long run. For example, under the new privacy policy, one Google product could generate traffic conditions if another Google product pinpoints the user in a certain geographic location.

Since the altered privacy policy was disclosed earlier this year, it has touched off a wave of international criticism from everyone from civil liberties watchdogs to elected officials.

In late February, 36 attorneys general signed an open letter dinging Google for not allowing users to opt out of the new privacy policy. The message, addressed to Google CEO Larry Page, added that the privacy shift allows a user’s personal information to be shared across multiple services even if the user signs up for only one service.

The privacy policy revamping basically results in personal data being “held hostage in the Google ecosystem,” the members of the National Association of Attorneys General said in the letter.

The association’s missive came several days after the Electronic Privacy Information Center sued the Federal Trade Commission as a way of persuading it to curb Google’s impending policy change.

And on Thursday, European Union Justice Commissioner Viviane Reding declared the consolidated privacy policy goes against European law. She told the BBC that the search giant is not following transparency rules as it collects personal information across Google’s dozens of platforms, including YouTube and Blogger.

Google has greeted each challenge with the same defense: Its new, unified privacy policy follows all applicable laws and makes using its services easier for all users.

The company told a reporter for The Washington Post’s Post Tech blog that it remains “happy to discuss this approach with regulators globally.”

Thursday’s Google blog post confirmed the company’s confidence in its privacy policy revision.

“As you use our products one thing will be clear: It’s the same Google experience that you’re used to, with the same controls,” Whitten wrote.


Online Privacy: Is it even possible in today's networked world?

WASHINGTON — On July 4, 1776, the founders of our country adopted the Declaration of Independence and forever altered the course of history. At the heart of that document is one line that stands out above all others: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty and the pursuit of happiness.”

Life, liberty and the pursuit of happiness: three ideas, three unalienable rights that have come to define our country and our country’s mindset. But there’s another idea that is thought to be in line with those: privacy. The Fourth Amendment to the Constitution, part of the Bill of Rights, guards against unreasonable searches and seizures. But is privacy a right, or is it just assumed to be a right? In a modern world where Facebook and targeted ad campaigns based on internet surfing patterns reign supreme, can we even assume that our information is being kept private and safe?

In the wake of recent congressional hearings on online privacy, major players such as Facebook, Apple and Google were questioned on that very topic: Is their consumers’ information safe and private?

At the hearing, Facebook chief technology officer Bret Taylor assured Senate leaders that they “never sell data to third parties or advertisers” and that “in every aspect of a product’s design, privacy is an aspect of the discussion.”

However, one day after these hearings, multiple media outlets reported that a hacker had compiled information from 100 million Facebook users—including email addresses, individual websites, and phone numbers—and made all of this information available for download.

This flies in the face of exactly what Taylor said: that such information is private and not available to hackers. Facebook will counter with an argument centering on user privacy controls, but does the company believe that everyone who uses its product is aware of those controls?

In a recent E-Business and ForeSee Results customer satisfaction index report, Facebook scored in the lowest five percent of private sector companies.

“Our research shows that privacy concerns, frequent changes to the website, and commercialization and advertising adversely affect the consumer experience,” said Larry Freed, president and CEO of ForeSee Results, in a press release.

Google, meanwhile, has faced similar problems concerning privacy. More than two months ago, Google admitted it collected data on users of its Google Maps Street View program. And in a move that will surely raise some eyebrows, Examiner.com reported Monday that a German company recently sold GPS-controlled surveillance drone cameras to Google. The reported purpose of the drones is that they will be used with other mapping projects.

In a world of increasing surveillance and, by default, less privacy, is there a reasonable right to expect privacy?

According to the Wall Street Journal, in 2008, Microsoft had plans to unveil its Internet Explorer 8 with a “privacy by default” setting, as opposed to Facebook’s opt-in privacy mantra. But Microsoft’s plan was quickly scrapped in favor of a track-and-sell targeted ad program aimed at its users. The reported reasoning for such a change: “Executives who argued that giving automatic privacy to consumers would make it tougher for Microsoft to profit from selling online ads.”

So the question becomes: If the companies in charge of so much of our so-called “private” information have no incentive to protect what we do online, should we demand more control over our privacy?