civil liberties – Medill National Security Zone (http://nationalsecurityzone.medill.northwestern.edu)
A resource for covering national security issues

Data collection brings more benefits than loss, experts say
Aug. 11, 2015

WASHINGTON – You’re probably one of the 91 percent of American adults who think they’ve lost control over how their personal information is collected and used by companies, according to a Pew Research Center study from early 2015. But big data collection brings benefits that outweigh the potential downsides, contended Ben Wittes, a senior fellow at the Brookings Institution, in a panel discussion at the Capitol Visitor Center last Thursday.

Consumers’ concerns about online privacy are at an all-time high due to emerging technologies – for instance, e-commerce and mobile devices – which collect large amounts of consumer data, the Pew Research study says.

However, people who worry about “privacy eroding into the river and being gone forever,” Wittes added, ignore how the benefits of those technologies can actually increase privacy.

The rise of online sales has meant you can mail-order products that might be too embarrassing to buy in person, Wittes added. “Without looking at somebody in the eye, without confessing the interest in this subject, you get what you want.”

Because all e-books look the same on an e-reader, for instance, you can read Fifty Shades of Grey on your Kindle without shame—which may explain why the e-version of this book has outsold its printed version.

The value of the privacy of those purchases, Wittes argued, outweighs the value of the data given for them—like email, credit card numbers, browsing history, personal preferences, and location-based information.

Wittes suggested changing the vocabulary consumers use to describe the benefits they get in exchange for giving up some personal information. It’s not only “convenience,” he said, “it’s also privacy benefits.”

Joshua New, policy analyst at the Information Technology and Innovation Foundation, said data collection also brings economic benefits to consumers.

He cited car insurance as an example. Instead of having premiums set by broad factors – for instance, age, gender and neighborhood – drivers could use data to prove that they are cautious and don’t brake hard, earning lower premiums even if they fall into the “high-risk” category under traditional measurements, New said.

People who strive for online privacy should be aware that there is a cost to it. Adam Thierer, a senior research fellow at George Mason University, said it’s not impossible for people to protect their privacy if they don’t mind losing the benefits of giving up their data.

“Companies can offer paid options where user information won’t be collected,” Thierer said. “But at the moment, I don’t think many people will pay for their privacy.”

A balance between consumer privacy and technological innovation is what the Federal Trade Commission is pursuing. Prohibiting data collection outright, which would create barriers to breakthrough innovation, is not the solution.

“We should definitely limit the use of data,” said Federal Trade Commission member Maureen Ohlhausen, “but not limit the collection of data.”


Published in conjunction with PC World

Florida postal worker who landed gyrocopter on Capitol Lawn pleads not guilty
May 26, 2015
Doug Hughes speaks after his hearing at the U.S. District Court. (Nick Kariuki/MEDILL NSJI)

WASHINGTON, May 21 (UPI) — Doug Hughes, the Florida mail carrier who landed his gyrocopter on the U.S. Capitol’s West Lawn last month, appeared in court on Thursday to plead not guilty to all six federal charges against him.

Among the charges against Hughes are two felonies: operating an aircraft without a license and flying an unregistered aircraft. He faces up to nine and a half years in prison.

“As long as I’m free I’m going to be introducing voters to groups with solutions to problems of corruption that the vast majority of voters recognize and oppose,” Hughes said after the hearing.

On April 15, tax day, Hughes piloted the low-altitude aircraft from Gettysburg, Pa., to Washington, landing on the Capitol lawn.

Hughes carried 535 two-page letters, one for every member of Congress, highlighting the need for campaign finance reform because of what he sees as the corrosive effect of money in politics. He described his actions as an act of civil disobedience.

“I’ll never do anything like this again, but I would do it exactly the way I did,” Hughes said.

Capitol Police arrested Hughes after he landed the small aircraft. He was later released on bail and remained under house arrest in Ruskin, Fla., where he wore an ankle monitor.

Magistrate Judge Alan Gray allowed Hughes to move within Hillsborough County, where he lives, though he still must wear the monitor.

The judge also refused to let Hughes visit the Capitol, the White House and other areas in Washington, from which he was banned immediately after the incident.

Hughes was also put on administrative leave from his job at the U.S. Postal Service.

The postal worker’s protest has raised concerns from lawmakers about the security of the Capitol. Hughes flew across 30 miles of some of the nation’s most restricted airspace on his route to D.C.

The Tampa Bay Times wrote about Hughes’s protest plans before the flight. He also informed the Secret Service and other news organizations by email and live-streamed the event on The Democracy Club, a website dedicated to congressional reform.

Hughes has expressed frustration that the focus has been on the security concerns raised, rather than the reasons for his flight:

“I have faith in a jury of my peers and will accept whatever consequence I must,” Hughes wrote in an op-ed in The Washington Post. “I simply hope by putting my freedom on the line, others might realize how precious their freedom is and join those of us engaged in this fight to preserve and protect our government of, by and for the people.”

Members of CODEPINK, the women-led grassroots activist group, presented Hughes with a framed stamp after the hearing.


Published in conjunction with UPI

A How-to Guide for Encrypting and Protecting Digital Communications using PGP
May 11, 2015
BY AARON RINEHART FOR THE MEDILL NSJI

“Encryption works. Properly implemented strong crypto systems are one of the few things that you can rely on. Unfortunately, endpoint security is so terrifically weak that NSA can frequently find ways around it.”

— Edward Snowden, answering questions live on the Guardian’s website

From surveillance to self-censorship, journalists are being subjected to increased threats from foreign governments, intelligence agencies, hacktivists and other actors who seek to limit or otherwise manipulate the information they possess. Edward Snowden stressed the importance of encryption for journalists in an encrypted interview with the New York Times: “It should be clear that unencrypted journalist-source communication is unforgivably reckless.” If journalists communicate insecurely and without encryption, they put themselves, their sources and their reporting at unnecessary risk. Risky behavior can also send the wrong message to potential key sources: Glenn Greenwald nearly missed out on the landmark story of National Security Agency surveillance set out in the Snowden documents because he wasn’t communicating via encryption.

The aim of this how-to guide is to provide a clear path forward for journalists to protect the privacy of their reporting and the safety of their sources by employing secure communication methodologies that are proven to deliver.

How and When Should I Encrypt?

Understanding the basics of encryption and applying these tools and techniques to a journalist’s reporting is rapidly becoming the new normal when conducting investigative research and communicating with sources. It is, therefore, just as vital to know when and how to encrypt sensitive data as it is to understand the tools needed to do it.

In terms of when to encrypt, confidential information should be encrypted both “at rest” and “in transit” (or “in motion”). The term data-at-rest refers to data sitting on storage media, such as a file in a folder on a computer’s desktop or an email in a user’s inbox. The term data-in-transit describes data that is moving from one place to another, such as a file being sent in an email or uploaded to a file server. With data-in-transit, the primary concern is how the data is transmitted from sender to receiver, not just the message itself. A public wireless network illustrates the point: if the network is not set up to use strong encryption to secure your connection, someone may be able to intercept your communications. Sensitive data that is not encrypted while in transit can be compromised.

Methods for protecting Data-at-Rest and Data-in-Transit

Data-at-Rest can be protected through the following methods.

One suggested methodology is to encrypt the entire contents of the storage media, such as a hard drive on a computer or an external drive containing sensitive material. This method provides a higher level of security and can be advantageous in the event of a loss or theft of the storage media.

A second method – which should ideally be combined with the first – is to encrypt the files, folders and email containing sensitive data using Pretty Good Privacy (or PGP) encryption. PGP encryption has the added benefit of protecting data-in-transit, since the data stays encrypted while in motion.
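For readers who prefer to script this step, the hedged sketch below shows one way to apply PGP encryption to a file so that the same ciphertext protects it at rest on disk and in transit when attached to an email. It assumes GnuPG plus the third-party python-gnupg wrapper, and the file names and recipient address are placeholders rather than recommendations.

```python
# Hedged sketch: PGP-encrypting a file with the python-gnupg wrapper
# (pip install python-gnupg) so the ciphertext is safe at rest and in
# transit. File names and the recipient address are placeholders.
import os
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))  # existing keyring

with open("notes-for-story.txt", "rb") as plaintext:
    result = gpg.encrypt_file(
        plaintext,
        recipients=["reporter@example.com"],   # their public key must already be imported
        output="notes-for-story.txt.gpg",      # ciphertext written next to the original
    )

print("Encrypted OK?", result.ok, "-", result.status)
# Only the .gpg file needs to be stored or emailed; the plaintext
# original can then be securely deleted.
```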

The name itself doesn’t inspire much confidence, but PGP or “Pretty Good Privacy” encryption has held strong as the preferred method by which individuals can communicate securely and encrypt files.

The concepts surrounding PGP and getting it operational can often seem complex, but this guide aims to make the process of getting started and using PGP clearer.

PGP Essentials: The Basics of Public and Private Keys

Before diving too deeply into the software setup needed to use PGP, it is important to understand a few key fundamentals of how PGP encryption works.

Within PGP and most public-key cryptography, each user has two keys that form something called a keypair. The reason the two keys are referred to as a keypair is that the two are mathematically linked.

The two keys used by PGP are referred to as a private key, which must always be kept secret, and a public key, which is available for distribution to people with whom the user chooses to communicate. In email communication, the private key is used to decrypt messages received from others, while the public key is what others use to encrypt mail to the user.

In order to send someone an encrypted email, the sender must first have that recipient’s public key and have established a trusted relationship. Most encryption systems for digital communications are based on establishing trust between the communicating parties, and in PGP, exchanging public keys is the first step in that process.
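As a concrete, hedged illustration of that first step, the sketch below uses the python-gnupg wrapper to export the user’s own public key for sharing and to import a public key received from a source; the email address and the pasted key block are placeholders.

```python
# Hedged sketch of the public-key exchange described above, using the
# python-gnupg wrapper. Addresses and the pasted key block are
# placeholders; only public keys are ever shared this way.
import os
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))

# Export your own public key (ASCII-armored text, safe to send or post).
my_public_key = gpg.export_keys("journalist@example.com")
print(my_public_key[:64], "...")

# Import a source's public key, e.g. pasted from an email or a website,
# so you can encrypt messages to them.
source_key_block = """-----BEGIN PGP PUBLIC KEY BLOCK-----
(placeholder: the source's real key material goes here)
-----END PGP PUBLIC KEY BLOCK-----"""
imported = gpg.import_keys(source_key_block)
print("Keys imported:", imported.count)
```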

Key Management: Best Practices

Regardless of whether the user is using the OpenPGP standard with GNU Privacy Guard (GPG) or another derivative, there are a few useful points to consider in terms of encryption key management.

Private Keys are Private!

The most important concept to remember is that private keys should be kept private. If someone compromises the user’s private key, all of the user’s encrypted communications would be trivial to decrypt.

Generating Strong Encryption Keys

When generating strong private/public keypairs there are some important things to remember:

  • Utilize Large Key Sizes and Strong Hashing Algorithms

It is recommended that when generating a keypair the key size be at least 4096-bit RSA with the SHA-512 hashing algorithm. The encryption key is one of the most important pieces of how the encryption operations are executed. The key provides a unique “secret” input that becomes the basis for the mathematical operations executed by the encryption algorithm. A larger key size increases the strength of the cryptographic operations because it complicates the math with a larger input value, making the encryption more difficult to break.

  • Set Encryption Key Expiration Dates

Choose an expiration date less than two years in the future.

  • Strong Passphrase

From a security perspective, the passphrase is usually the most vulnerable part of the encryption procedure. It is highly recommended that the user choose a strong passphrase.

In general terms, the goal should be to create a passphrase that is easy to remember and to type when needed, but very hard for someone else to guess.

A well-known method for creating strong, but easy to remember, passwords is “diceware.” Diceware is a method for creating passphrases, passwords and other cryptographic variables using ordinary dice as a random number generator. The random numbers generated from rolling dice are used to select words at random from a special list called the Diceware Word List. The recommendation when using diceware to create a PGP passphrase is to use a minimum of six words. An alternative method for creating and storing strong passphrases is to use a secure password manager such as KeePass.
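To tie these recommendations together, the hedged sketch below simulates diceware-style word selection with Python’s secrets module and then generates a 4096-bit RSA keypair with a one-year expiration through the python-gnupg wrapper. The short word list, the identity details and the library choice are assumptions for illustration only; a real passphrase should draw from the full 7,776-word Diceware list.

```python
# Hedged sketch: build a diceware-style passphrase with the standard
# library's secrets module, then generate a 4096-bit RSA keypair with
# python-gnupg. The short word list and identity are placeholders.
import os
import secrets
import gnupg

# In practice, load the full 7,776-word Diceware list; this tiny list
# only demonstrates the mechanics.
WORDS = ["caper", "orbit", "mural", "saddle", "ponder", "quartz",
         "lagoon", "ember", "trellis", "nimbus", "falcon", "drift"]

def diceware_passphrase(num_words=6):
    """Pick words with a cryptographically secure random source."""
    return " ".join(secrets.choice(WORDS) for _ in range(num_words))

passphrase = diceware_passphrase()
print("Memorize this passphrase; do not store it in plain text:", passphrase)

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))
key_settings = gpg.gen_key_input(
    key_type="RSA",
    key_length=4096,                         # large key size, as recommended above
    name_real="Jane Reporter",               # placeholder identity
    name_email="jane.reporter@example.com",
    passphrase=passphrase,
    expire_date="1y",                        # expires in one year, under the two-year ceiling
)
key = gpg.gen_key(key_settings)              # can take a while as entropy is gathered
print("New key fingerprint:", key.fingerprint)
```

Because the sketch writes to the standard GnuPG keyring, the resulting key should also be visible in GUI tools such as GPG Keychain Access or Enigmail, which read the same keyring.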

Backing Up Private Keys

Although a journalist may be practicing good security by encrypting sensitive information, it would be devastating if a disruptive event – such as a computer hardware failure – caused them to lose their private key, since data encrypted to it would be near-impossible to decrypt without it. When backing up a private key, remember that it should only be stored on trusted media, a database or a storage drive that is preferably encrypted.

Public Key Servers

There are several PGP public key servers available on the web. It is recommended that journalists upload a copy of their public key to a public key server such as hkp://pgp.mit.edu to open their reporting up to potential sources who wish to communicate securely. By uploading a copy of the public key to the key server, anyone who wants to communicate can search by name, alias or email address to find the public key of the person they’re looking for and import it.
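For those who want to script the keyserver steps, the hedged sketch below uses python-gnupg to publish a public key to the MIT keyserver and to search for and import a source’s key; the fingerprint and the search address are placeholders.

```python
# Hedged sketch: publish a public key and search a keyserver with
# python-gnupg. The fingerprint and the search query are placeholders.
import os
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))
KEYSERVER = "hkp://pgp.mit.edu"

# Upload (send) your own public key so sources can find it.
my_fingerprint = "REPLACE_WITH_YOUR_KEY_FINGERPRINT"
gpg.send_keys(KEYSERVER, my_fingerprint)

# Search for a source by name or email, then import the chosen key.
matches = gpg.search_keys("source@example.com", KEYSERVER)
for match in matches:
    print(match.get("keyid"), match.get("uids"))
if matches:
    gpg.recv_keys(KEYSERVER, matches[0]["keyid"])
```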

Validating Public Keys: Fingerprints

When a public key is received over an untrusted channel like the Internet, it is important to authenticate the public key using the key’s fingerprint. The fingerprint of an encryption key is a unique sequence of letters and numbers used to identify the key. Just like the fingerprints of two different people, the fingerprints of two different keys can never be identical. The fingerprint is the preferred method to identify a public key. When validating a public key using its fingerprint, it is important to validate the fingerprint over an alternative trusted channel.

For example, if a journalist retrieves a source’s public key from a public key server, it is important to validate the key by communicating in person, calling over a secure phone line or using another trusted communication channel. The purpose of key validation is to guarantee that the person being communicated with is the key’s true owner.
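One way to make that comparison less error-prone is to check the fingerprint stored in the local keyring, character by character, against the fingerprint the source reads out over the trusted channel. The snippet below is a hedged sketch of that check using python-gnupg; the email address and the dictated fingerprint are placeholders.

```python
# Hedged sketch: compare a locally stored key fingerprint against one
# the source dictated over a trusted channel (in person, secure call).
# The address and the dictated value below are placeholders.
import os
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))

source_email = "source@example.com"
dictated = "1234 5678 9ABC DEF0 1234  5678 9ABC DEF0 1234 5678"

def normalize(fp):
    """Ignore spacing and case so only the hex digits are compared."""
    return fp.replace(" ", "").upper()

for key in gpg.list_keys():               # public keys in the local keyring
    if any(source_email in uid for uid in key["uids"]):
        match = normalize(key["fingerprint"]) == normalize(dictated)
        print(key["fingerprint"],
              "-> verified" if match else "-> MISMATCH, do not trust this key")
```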

Adding PGP Public Key Fingerprint to Twitter

For journalists, it’s important to ensure sources can quickly validate the public keys they retrieve from public key servers. A common method is to tweet the public key fingerprint and link to that tweet in the Twitter bio.

Another method is to link directly to the PGP key on a public keyserver (like MIT’s) and to provide a copy of the key fingerprint in the bio, like this example with Barton Gellman.


GNU Privacy Guard: Encrypting Email with GPG

Hold up, stop and wait a minute: I thought the topic of discussion was PGP.

Is GPG a typo?

No. In fact, GPG (or the GNU Privacy Guard) is the GPL-licensed alternative to the PGP suite of encryption software. Both GPG and PGP utilize the same OpenPGP standard and are fully compatible with one another.

Getting Started with GPG: From Setup to Secure

A Step-by-Step Guide to Setting up GPGTools on Apple OSX

Tutorial Objectives

  • How to install and configure PGP on OS X
  • How to use PGP operationally

Install the GPGTools GPG Suite for OS X

This step is simple. Visit the GPGTools website and download the GPG Suite for OS X. Once downloaded, mount the DMG and run the “Install.”


Select all modules and, then, press “Install.”


Generating a New PGP key

When the installer completes, a new app called “GPG Keychain Access” will launch. A small window will pop up immediately and say: “GPG Keychain Access would like to access your contacts.” Press “OK.”


After pressing “OK,” a second window will pop up that says “Generate a new keypair.” Type in your name and your email address. Also, check the box that says “Upload public key after generation.” The window should look like this:

[Screenshot]

Expand the “Advanced options” section. Increase the key length to 4096 for extra security. Reduce the “Expiration date” to 1 year from today. The window should look like this:

[Screenshot]

Press “Generate key.”

After pressing “Generate key,” the “Enter passphrase” window will pop up.

Okay, now this is important

The Importance of Good Passphrases

The entire PGP encryption process will rest on the passphrase that is chosen.

First and foremost: Don’t use a passphrase that other people know! Pick something only you will know and others can’t guess. Once you have a passphrase selected, don’t give it to other people.

Second, do not use a password, but rather a passphrase — a sentence. For example, “ILoveNorthwesternU!” is less preferable than “I graduated from Northwestern U in 1997 and it’s the Greatest U on Earth?!” The longer your passphrase, the more secure your key.

Lastly, make sure your passphrase is something you can remember. Since it is long, there is a chance you might forget it. Don’t; the consequences will be dire. There are several ways to store your passphrase to ensure its safekeeping. One is to use a password manager like KeePass, an open-source encrypted password database that securely stores your passwords.

Once you decide on your passphrase, type it in the “Enter passphrase” window. Turn on the “Show typing” option, so you can be 100% sure that you’ve typed in your passphrase without any spelling errors. When everything looks good, press “OK:”

[Screenshot]

You will be asked to reenter the passphrase. Do it and press “OK:”

[Screenshot]

You will then see a message saying, “We need to generate a lot of random bytes…” Wait for it to complete:

[Screenshot]

Your PGP key is ready to use:

[Screenshot]

Setup PGP Quick Access Shortcuts

Open System Preferences, select the “Keyboard” pane and go to the “Shortcuts” tab.

On the left hand side, select “Services.” Then, on the right, scroll down to the subsection “Text” and look for a bunch of entries that start with “OpenPGP:”

Go through each OpenPGP entry and check each one.


Bravo! You’re now done setting up PGP with GPGTools on OS X!

Now, let’s discuss how to use it.

How to send a secure email

To secure an email in PGP, you will sign and encrypt the body of the message. You can just sign or just encrypt, but combining both operations will result in optimum security.

Conversely, when you receive a PGP-secured email, you will decrypt and verify it. This is the “opposite” of signing and encrypting.

Start off by writing an email:

  1. Select the entire body of the email and “Right Click and Go to Services -> OpenPGP: Sign” to sign it.
  2. Open the GPG Keychain Access app. Select “Lookup Key” and type in the email address of the person you are sending your message to. This will search the public keyserver for your source’s PGP key.

If your source has more than one key, select his most recent one.

You will receive a confirmation that your source’s key was successfully downloaded. You can press “Close.”

You will now see your source’s public key in your keychain.

  3. You can now quit GPG Keychain Access and return to writing the email.
  4. Select the entire body of the email (everything, not just the part you wrote) and “Right Click and Go to Services -> OpenPGP: Encrypt” to encrypt it. A window will pop up, asking you who the recipient is. Select the source’s public key you just downloaded and press “OK.”
  5. Your entire message is now encrypted! You can press “Send” safely.

As a reminder, you will only need to download your source’s public key once. After that, it will always be available in your keychain until the key expires.
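For journalists comfortable scripting this workflow instead of using the menu items above, the hedged sketch below shows one way to sign and encrypt a message body in a single step with the third-party python-gnupg wrapper around GPG; the email addresses, the passphrase prompt and the message text are illustrative assumptions, not part of the GPGTools tutorial itself.

```python
# Hedged sketch: sign-and-encrypt a message body in one step with
# python-gnupg. Addresses are illustrative; the recipient's public key
# must already be imported (see the keyserver steps earlier).
import os
import getpass
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))

message = "Meet at the usual place at 4 p.m. Bring the documents."
passphrase = getpass.getpass("Your private key passphrase: ")

encrypted = gpg.encrypt(
    message,
    recipients=["source@example.com"],   # whose public key to encrypt to
    sign="journalist@example.com",       # sign with your own key
    passphrase=passphrase,               # unlocks your private key for signing
)

if encrypted.ok:
    print(str(encrypted))   # ASCII-armored block, safe to paste into an email body
else:
    print("Encryption failed:", encrypted.status)
```

Doing both operations in one call mirrors the guide’s advice to always sign when encrypting.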

How to receive a secure email

With our secure message sent, the recipient will now want to decipher it. For the sake of this step, I will pretend that I am the recipient.

I have received the message:

[Screenshot]

  1. Copy the entire body, from and including “-----BEGIN PGP MESSAGE-----” to and including “-----END PGP MESSAGE-----”. Open a favorite text editor and paste it:

[Screenshot]

  2. Select the entire text and “Right Click and Go to Services -> OpenPGP: Decrypt” to decrypt the message. You will immediately be prompted for your PGP passphrase. Type it in and press “OK:”

[Screenshot]

  3. You will now see the decrypted message!

Next, you can verify the signature.

  4. Highlight the entire text and “Right Click and Go to Services -> OpenPGP: Verify.” You will see a message confirming the verification.
  5. Press “OK.”
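The receiving side can be scripted in the same spirit. The hedged sketch below assumes the same python-gnupg wrapper; the armored block shown is a placeholder for the real message copied between the BEGIN and END lines.

```python
# Hedged sketch: decrypt a received PGP block and check its signature
# with python-gnupg. The armored text is a placeholder; paste in the
# real block from "-----BEGIN PGP MESSAGE-----" to "-----END PGP MESSAGE-----".
import os
import getpass
import gnupg

gpg = gnupg.GPG(gnupghome=os.path.expanduser("~/.gnupg"))

armored_message = """-----BEGIN PGP MESSAGE-----
(placeholder: the ciphertext you received goes here)
-----END PGP MESSAGE-----"""

passphrase = getpass.getpass("Your private key passphrase: ")
result = gpg.decrypt(armored_message, passphrase=passphrase)

if result.ok:
    print("Decrypted message:\n", str(result))
    # When the sender signed the message, decrypt() also verifies it.
    print("Signature valid?", result.valid, "from key:", result.key_id)
else:
    print("Could not decrypt:", result.status)
```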

Setting up GPG4Win on Windows

Tutorial Objectives

  • How to install and configure PGP on a PC
  • How to use PGP operationally

Installing the GPG4Win GPG Suite

This step is simple.

  1. Visit the Gpg4win website and download the GPG suite for Windows.
  2. Once downloaded, double-click the installer file to begin the installation wizard.


  3. Select the components to install, but keep it simple by installing all components except for Claws Mail, then select “Next.”

A brief description of each component:


  • Kleopatra – a certificate manager
  • GPA – another certificate manager
  • GpgOL  – a plugin for Outlook
  • GPGEX – an extension for Windows Explorer
  • Claws Mail – a lightweight email program with GnuPG support built in
  • Gpg4win Compendium  – a manual
  4. Select desired installation preferences and click “Next.”

5. Click “Finish” to exit the install wizard.

Setting up GPG4Win using Thunderbird and Enigmail

Enigmail, a play on words referencing the Enigma machine used to encrypt secret messages during World War II, is a security extension, or add-on, for the Mozilla Thunderbird email software. It enables you to write and receive email messages signed and/or encrypted with the OpenPGP standard. Enigmail provides a more simplified method for sending and receiving encrypted email. This step-by-step guide will help you install and configure the extension.

  1. Open Thunderbird and navigate to the Add-Ons Manager under the “Tools” menu.
  2. In the search dialog box, type “Enigmail.” A list of add-ons will appear.
  3. Select the Enigmail add-on from the list.
  4. Click the “Install” button.


  5. There should now be a message indicating, “Enigmail will be installed after you restart Thunderbird.” Proceed with the installation by clicking on the words “Restart Now.”


  6. After the Thunderbird application restarts, Enigmail should appear in the add-ons list. Proceed with configuring the add-on by selecting Enigmail from the list.


  7. Select “I prefer a standard configuration (recommended for beginners)” and click “Next.”

Generating a Public/Private Keypair.

  1. Select the Account to generate the keys for.
  2. Enter a strong passphrase.
    • If the passphrase isn’t strong enough, the Passphrase quality meter will indicate that with a red- or yellow-colored bar (vs. the green one shown in the image below).
  3. Re-enter the strong passphrase to confirm.


The key generation process collects random data from system activity into a randomness pool, which is then used to generate the keypair.


  4. Save the revocation certificate to a trusted, safe and separate device (storage media)!

The revocation certificate can be used to invalidate a public key in the event the secret (private) key is lost or compromised.


Key Management / View Key in Enigmail


  1. Change the expiration date (suggested <2 years)


  2. Upload Key To Public Keyserver (like hkp://pgp.mit.edu).

Public Keyserver Lookup

Look up the public keys of other people on a public keyserver directly from within Enigmail.

  1. Select “Search for Keys” from the “Keyserver” dropdown menu.
  2. Enter the email address or <FirstName> <LastName> of the person being looked up.

A good test for this function is to try searching for Glenn Greenwald.


Notice how many active public keys Glenn Greenwald has. This could be intentional, but it can also happen when someone sets up keys on a new device or email client after forgetting the private key passphrase, or after losing the key revocation file (or forgetting the passphrase to unlock it).

The problem is that anyone contacting the user for the first time will have to figure out which key is the correct one to use. It also becomes a security risk, because any one of those unused but active keys could be compromised, allowing an adversary to access communications.

MORAL OF THE STORY: Set an expiration date, manage the revocation certificate file and manage passphrases.

Revoking a Key

  1. Right-click on the key and click on Key Properties.


  2. At the bottom of the window, click on the “Select Action” dropdown menu and select “Revoke Key.”


  3. A dialog box will pop up asking for the private key’s passphrase. Enter the passphrase for the key that is being revoked.


  4. Once completed, the user will receive an “Enigmail Alert” indicating that the key has been revoked. The alert warns that if the key is available on a keyserver, it should be re-uploaded so that others can see the revocation. It is important to update the public keyservers to ensure that sources are aware of revoked keys and new keys on each account.


Operationalizing GPG: A Pragmatic Approach

What do encrypt, decrypt, sign, and verify mean?

  • Encrypting a message uses the recipient’s public key (together with the sender’s secret key when the message is also signed) to jumble it. The jumbled text is secure from prying eyes. The sender always encrypts.
  • Decrypting takes an encrypted message and, using the recipient’s secret key (and the sender’s public key to check any signature), descrambles it. The recipient always decrypts. Encrypt and decrypt can be thought of as opposites.
  • Signing a message lets the receiver know that the user (the person with the user’s email address and public key) actually authored the message. Signing also provides cryptographic integrity by ensuring that no one has tampered with the message. The sender always signs a message.
  • Verifying a message is the process of analyzing a signed message to determine whether the signature is genuine. Signing and verifying can be thought of as opposites.

When should someone sign a message? When should they encrypt?

If it is unnecessary to sign and encrypt every outgoing email, when should the user sign? And when should the user encrypt? And when should the user do nothing?

There are three sensible choices when sending a message:

  • Do nothing. If the contents of the email are public (non-confidential), and the recipient does not care whether the user or an impostor sent the message, then do nothing. The user can send the message as they’ve sent messages their entire life: in plain text.
  • Sign, but don’t encrypt. If the contents of the email are public (non-confidential), but the recipient wants assurance that the suspected sender (and not an impostor) actually sent the message, then the user should sign but not encrypt. Simply follow the tutorial above, skipping over the encryption and decryption steps.
  • Sign and encrypt. If the contents of the email are confidential, sign and encrypt. It does not matter whether the recipient wants assurance that the user sent the message; always sign when encrypting.

For the majority of emails the user sends, encryption is simply not necessary. The remainder of the time, the user should sign and encrypt.

Whenever there is confidential information — such as sensitive reporting information, source names and addresses, credit card numbers, bank numbers, Social Security numbers, corporate strategies or intellectual property — users should sign and encrypt. With confidential information, users should err on the side of caution and sign and encrypt liberally rather than do nothing and leak sensitive information. When authenticity matters but the content is not confidential, signing without encrypting is enough.

Best Practices in Information Security

Despite best practices regarding the operational usage of PGP encryption, a disregard for the fundamentals of information security can still put a journalist’s communications in peril. It doesn’t matter how strong the encryption is if the user’s laptop has already been compromised; it is then only a matter of time before the journalist’s encrypted communications are in jeopardy.

Below is a short list of some high-level information security best practices. For more information on this subject, see Medill’s National Security Zone Digital Security Basics for Journalists.

  • Good password management

Journalists should not only create strong passwords, but also avoid using the same password for anything else. Consider using a password manager.

  • Keep software up to date.

Update software frequently. This helps thwart a majority of attacks on your system.

  • End-Point Security Software

Make use of antivirus and anti-malware software.

  • Be wary of odd emails and accompanying attachments.

When in doubt, don’t click. The goal of most phishing attacks is to either get you to download a file or send you to a malicious website to steal your username and password. If it doesn’t seem right or doesn’t make sense, try reaching out to the person via an alternate communication method before clicking.

  • Stay away from pirated software.

Nothing is truly free, as these software packages can often come with unintended consequences and malicious code packaged with them.

GPG Alternatives

  • CounterMail: a secure online email service utilizing PGP without the complexity of key management.
  • DIME (Dark Internet Mail Environment): a new approach and potential game-changer for secure and private email communications.

Additional Resources

Glossary of Terms

Keyword – Definition
Ciphertext – Encrypted text. Plaintext is what you have before encryption, and ciphertext is the encrypted result. The term cipher is sometimes used as a synonym for ciphertext, but it more properly means the method of encryption rather than the result.
Data-at-Rest – Data in computer storage, excluding data that is traversing a network or temporarily residing in computer memory to be read or updated.
Data-in-Transit – Data that has left its resting place in storage and is in motion, for example moving across a network.
Digital Signature – A mathematical technique used to validate the authenticity and integrity of a message, software or digital document.
Encryption – The conversion of electronic data into another form, called ciphertext, which cannot be easily understood by anyone except authorized parties.
Fingerprint – In public-key cryptography, a public key fingerprint is a short sequence of bytes used to authenticate or look up a longer public key. Fingerprints are created by applying a cryptographic hash function to a public key.
Key – In cryptography, a key is a variable value that is applied using an algorithm to a string or block of unencrypted text to produce encrypted text, or to decrypt encrypted text.
Password Manager – A software application that helps a user store and organize passwords. Password managers usually store passwords encrypted, requiring the user to create a master password: a single, ideally very strong password that grants the user access to the entire password database.
Private Key – In cryptography, a private or secret key is an encryption/decryption key known only to the party or parties that exchange secret messages. In traditional secret-key cryptography, a key would be shared by the communicators so that each could encrypt and decrypt messages; the risk is that if either party loses the key or it is stolen, the system is broken. A more recent alternative is to use a combination of public and private keys, in which a public key is used together with a private key.
Public Key – In cryptography, a public key is a value provided by some designated authority as an encryption key that, combined with a private key derived from the public key, can be used to effectively encrypt messages and digital signatures.

Term Definitions provided by TechTarget.com, Webopedia.com, and Wikipedia.org

FOIA update: USDB releases Manual for the Guidance of Inmates (USDB Regulation 600-1, Nov. 2013)
April 27, 2015

WASHINGTON — On Monday, the United States Disciplinary Barracks’ Directorate of Inmate Administration released “USDB Regulation 600-1, Nov. 2013,” entitled “Manual for the Guidance of Inmates,” to the Medill National Security Journalism Initiative in response to an April 17 Freedom of Information Act request.

The 141-page document serves as the official rulebook for the treatment and behavior of inmates held at the military prison (including WikiLeaks firestarter Chelsea Manning) and addresses everything from media contact with inmates to rules regarding their appearance and hygiene.

The FOIA request was intended to increase transparency regarding the U.S. Army’s regulation of USDB inmates held at Fort Leavenworth, to better inform the press about rules regarding their contact with prisoners and to shed light on the status of civil liberties within the prison’s walls.

You can view the entire document below:

The drone debate: Does the coming swarm of flying gadgets require new privacy laws?
April 23, 2015

CyPhy drone (Image courtesy of CyPhy Works)

Noticeably missing from the drone regulations the Federal Aviation Administration proposed earlier this year was any privacy oversight. For the Electronic Privacy Information Center (EPIC), the plaintiffs in the suit against the FAA, that was inexcusable.

The advocacy group’s website is full of unnerving facts about camera-wielding drones. They can be equipped with facial recognition, license plate scanners, the capacity to track multiple targets and the ability to operate at distances and heights that make them impossible to detect. Drones are “designed to undertake constant, persistent surveillance to a degree that former methods of video surveillance were unable to achieve,” according to EPIC.

The courts are not the only avenue for effecting policy change at the FAA. Public comment on the framework will be accepted until this Friday.

But many experts argue that the FAA isn’t the governing body that should be charged with ensuring drones don’t violate privacy. What’s more, others – chiefly drone-makers and their advocates – question whether unmanned aerial vehicles, or UAVs, even require a new set of privacy standards, saying that existing laws are already enough.

In a privacy impact assessment issued alongside the proposed framework, the FAA stated that while it “acknowledges that privacy concerns have been raised about unmanned aircraft operations … these issues are beyond the scope of this rulemaking.”

While some privacy advocates are worried that the omission may allow for invasive surveillance from commercial or government drones operating inside the US, the drone community said those concerns are more of a red herring than anything else.

“The FAA has wisely backed off all privacy issues [because] there’s no need for a new federal privacy bureaucracy [when] states already have protections in place,” says Charles Tobin, a privacy rights lawyer at the law firm Holland & Knight, who represents a coalition of media outlets advocating for drone usage for the purposes of journalism.

“The laws that are on the books are all technology agnostic. They apply to computers, they apply to still cameras, they apply to wireless microphones, they apply to video cameras … and there’s no reason that they can’t be applied – as already written – to UAV,” he says.

Relying on existing legal protections should be the obvious choice, says Brendan Schulman, head of the Unmanned Aircraft Systems practice at the law firm Kramer Levin in New York.

Nicknamed “the Drone Lawyer,” Mr. Schulman says that “if the concern is physical intrusion or inappropriate photographs, state law governing offenses such as trespass, stalking, peeping or unlawful surveillance … apply.” That means that what people are most fearful of – being stalked, harassed, or surveilled by a drone, or being victimized by a peeping tom behind a drone – are already acts bound by law.

Simply put: the states have things covered, he says.

REGULATORS LAG BEHIND TECHNOLOGY
A presidential memorandum issued the same day as the FAA’s proposed regulations assigns the responsibility to “develop a framework regarding privacy, accountability, and transparency for commercial and private [unmanned aerial systems] use” to the Department of Commerce. The memo states that the department must initiate a “multistakeholder engagement process” within 90 days of the memo’s release – so it must begin work by mid-May.

But government trying to regulate a specific piece of technology is not the best approach, says Matt Waite, professor of journalism and founder of the Drone Journalism Lab at the University of Nebraska-Lincoln, which explores the ways in which drones can be used to further journalistic aims.

“As we are already seeing, the government lags way behind technology when it comes to laws that would deal with that technology. It’s taken the FAA a long time to come up with [proposed] rules for these drones and they’re flying around right now. They’re being used for commercial purposes even though the FAA says, ‘No, you can’t do that.’ ”

Mr. Waite says it’s important to determine exactly what people can’t do – what actions need to be stopped. “We need to start thinking about what we consider a reasonable expectation of privacy in our modern times. And if that’s not allowing [me to] photograph [someone] streaking in their backyard, then that’s great. We can say I can’t do that. But it shouldn’t matter how I do that, [just that] you don’t want me to do it.”

It’s about recognizing that once privacy has been violated, how it was violated is no longer important, says Helen Greiner, chief executive officer of Massachusetts robotics and drone company CyPhy Works. She says that although she understands the privacy concerns related to the commercial use of drones, those concerns are often misdirected: “It’s not a drone issue. It’s a camera issue. In that way, it’s kind of a red herring.”

“You need to go to the real issue, which is pointing cameras at things they shouldn’t be pointed at,” Ms. Greiner says. “And if we’re going to talk about privacy with cameras, it should be for all cameras … whether they’re on a drone or a balloon.”

She says she doesn’t worry that public fear might hurt her business because the drones sold by CyPhy Works are used to perform specific commercial functions. “They may be used to survey a property or a facility, for example,” but they’re not being used to capture footage surreptitiously, Greiner says.

“I believe privacy is an important issue and that it should be regulated, but rules already exist,” she explains. She says it’s unlikely that fears related to the perceived loss of privacy will bog down final passage of FAA regulations – something she’s anxiously awaiting: “It might be wishful thinking, but I don’t foresee a tightening in terms of the finalized regulations.”

THE CASE FOR PRIVACY POLICIES
A public commentary period on the proposed regulations expires Friday, but there’s no firm deadline for when the FAA must have finalized regulations in place. Experts think it could take two years, possibly longer – which means the waiting game has only begun. It also means that commercial drone use will remain technically illegal for the duration, outside of a handful of exemption-type permissions granted by the FAA.

Amie Stepanovich, senior policy counsel for privacy advocacy group Access Now, says that there’s room for improvement when it comes to ensuring personal privacy. That’s because drone technology is in a league of its own: “Drones have [the] capacity to bring a bunch of different surveillance technologies onto a singular platform and to reach into areas that other vehicles have not been able to get to.”

Ms. Stepanovich says that limitations should be put in place to restrict the ways in which government agencies can use drone technology for the purpose of surveillance. “We need things that will, for example, protect users’ location information from being collected and tracked,” she says. “It comes back to tracking people over time without a warrant and being able to pinpoint their exact location. … and we need to make sure that that information is adequately protected.”

But she’s also a fan of technology agnosticism. She says that whatever restrictions are put in place, they should not be drawn up as drone-specific. “There are several other different kinds of technologies that are coming out,” she says, referring to Stingray trackers that are now being used by law enforcement agencies to gather data from cellphones.

The presidential memo issued in conjunction with the FAA’s proposal states that agencies must “comply with the Privacy Act of 1974, which, among other things, restricts the collection and dissemination of individuals’ information that is maintained in systems of records, including personally identifiable information.”

Although the White House’s assurance that government agencies will be held accountable to legacy privacy standards is a start, Stepanovich recommends further attribution and transparency.

“The FAA has a publicly accessible database of who is able to fly airplanes in any specific geographic area in the United States. But they haven’t made a similar commitment to do that for drone operators,” Stepanovich says. She calls that a double standard.

People won’t know which agency, company, or person is behind the remote control of the drone flying over their homes, Stepanovich says. “So the FAA definitely has a role to play in protecting privacy,” she says.

Stepanovich suggests the FAA incorporate a registry: “We’re talking about transparency, requiring that drone users register what technology they are deploying on their drones, and what capacity these drones will have. This just gets at making sure people are aware of what’s going on in their own area.”

Minimizing your digital trail
March 21, 2015

WASHINGTON — In popular culture, going “off the grid” is generally portrayed as either unsustainable or isolated: a protagonist angers some omniscient corporate or government agency and has to hole up in a remote cabin in the woods until he can clear his name, or an anti-government extremist sets up camp, also in the middle of nowhere, living off the land, utterly cut off from society at large.

But is there a way to live normally while also living less visibly on the grid? What steps can you take to reduce your digital footprint that don’t overly restrict your movements?

What is a digital footprint?

Your digital footprint is the data you leave behind when you use a digital service—browse the web, swipe a rewards card, post on social media. A digital footprint generally falls into one of two classifications: active or passive.

Your active digital footprint is any information you willingly give out about yourself, from the posts you put up on Facebook to the location information you give to your local mass transit system when you swipe your transit pass.

By contrast, your passive digital footprint is information that’s being collected about you without your express knowledge or authorization, for example, the “cookies” and “hits” saved when you visit a website. When you see personalized ads on Google, for example, those are tailored to you through collection of your personal preferences as inferred through collection of your passive digital footprint.

To assess my digital footprint, I looked through my wallet, my computer and my phone.

The footprint in your wallet

First, the wallet: I have several rewards cards, each representing a company that has a record of me in its database that shows how often I shop and what I buy, which is linked to my name, address, email and birthday—plus a security question in case I forget my password, usually my mother’s middle name.

While I would consider this information fairly benign—they don’t have my credit card information or my Social Security number—these companies can still make many inferences about me from my purchases. CVS, for example, could probably say fairly accurately if I’m sick based on my purchase of medications, whether I’m sexually active based on birth control purchases and any medical conditions I may have based on my prescription purchases.

If I wanted to minimize my digital footprint, I could terminate all my rewards accounts and refrain from opening any more. For me, though, it’s worth allowing these companies to collect my information in order to receive the deals, coupons and specials afforded me as a rewards member.

Next up is my transit pass, which is linked to my name, local address and debit card. The transit authority has a record of every time I swipe my way onto a city bus or train, a record of my movements linked to my name.

A minimal-footprint alternative to a transit pass is single-use fare cards. If purchased with cash, they would leave no record of my travels linked to my name. While this, like the rewards cards, is feasible, it’s far less convenient than the pass — so much less so that again I’m willing to compromise my privacy.

My debit card and insurance card are the two highest-value sources of personal information, but both are utterly necessary—living half a country away from my local credit union, I need my debit card to complete necessary transactions. My medical insurance card, relatively useless to identity thieves unless they have an ID with my name on it, does represent another large file in a database with my personal information—doctors’ visits, prescriptions and hospital stays for the past several years. People with just the physical card, not my license or information, can’t do much with that, but if a hacker gets to that information it could be very damaging.

No driver’s license? No credit card?

To minimize my digital footprint, then, I could pare down my wallet to just the absolute necessities—my insurance card, debit card and license.

Computer footprint

If I’m guilty of leaving a large digital footprint, all my worst infractions probably happen across the Web.

Between Facebook, Twitter and Pinterest, I’ve broadcast my name, picture, email, hometown and general movements, if not my specific location, on each of those sites. Of the three, Facebook certainly has the most comprehensive picture of my life for the past seven years—where I’ve been, with whom, what I like and what I’m thinking.

If I wanted to take myself as far off the grid as feasible, simply deactivating the accounts wouldn’t work—Facebook keeps all your information there for you to pick up where you left off. You can permanently delete it with no option for recovery, but some information isn’t stored just on your account—messages exchanged with friends, for example, or any information shared with third-party apps.

If you keep using social networking sites, privacy policies change frequently, meaning that even if you choose the most restrictive privacy settings, you often have to go back and re-set them whenever the company changes its policy. Apps complicate things even further, farming out much of your information to third-party companies with different privacy policies.

Even if you’re vigilant about your privacy settings and eschew apps, your profile is only as private as your most public Facebook friend, said Paul Rosenzweig, a privacy and homeland security expert.

When shopping online, it’s important to check the privacy statements and security policies of the companies you’re using. If possible, purchase gift cards to the specific retailer or from credit card companies and use those to shop, so you don’t leave your credit card information vulnerable to breaches like that of Target.

I know that email is not my friend when it comes to online privacy, but I can’t operate without it. I use Gmail on Google Chrome for my email, so I installed Mymail-Crypt. It’s one of several “Pretty Good Privacy,” or PGP, encryption programs. Using it, my messages appear as a jumbled bunch of letters until the recipient decrypts them using their private key. I can upload my public key to a key server, like the aptly named Keyserver, where it’s searchable by my email or key ID, and I can link to it on my personal profiles such as Facebook or LinkedIn. People can then send me an encrypted email using my public key that cannot be read without my private key to unlock it. I’ve also started encrypting my G-Chats using Off the Record messaging.

Email can be used against you. Phishers have started to send more sophisticated emails imitating individuals or companies you trust in order to convince you to give up information like your Social Security number or credit card data. Drew Mitnick, a junior policy counsel at digital rights advocacy group Access Now, said you need to be vigilant no matter what you’re doing on the internet.

“Ensure that whoever you’re dealing with is asking for appropriate information within the scope of the service,” he said. In other words, Gap shouldn’t be asking for your Social Security number.

To limit cookies and other data collection during your Internet use, you can open incognito windows in Google Chrome. In incognito mode, the pages you view don’t stay in your browser or search histories or your cookie store—though your Internet service provider and the sites you visit still have a record of your browsing.

Finally, encrypt your hard drive. Privacy laws vary from state to state and country to country, so the best way to ensure that you’re protected no matter where you are is to encrypt your computer and be careful not to leave it where someone can mess with it, said Mitnick.

Phone footprint

Another source of vulnerability for many people is a smartphone. As long as you have a phone, you’re on the grid—phone companies can triangulate your position using cell phone towers and location services, and they log your calls. Beyond that, though, there are steps you can take to limit information people can access about you using your phone.

First, be judicious when installing apps. Carefully read the permissions an app requires for installation, and if you’re uncomfortable with them, don’t install it! Read privacy policies and terms of use so you know what data the app keeps on you.

Because I have a Windows phone, many of the basic apps (alarms, maps, Internet Explorer, music, and Microsoft Office) are Microsoft apps and use their terms of use and privacy policy, which is pretty good about not sharing my information with third parties. They also delete your account data after you delete their app, though it may take a few weeks.

I have several social apps, such as the aforementioned Facebook and Pinterest, for which the privacy settings are fairly similar to their desktop counterparts—not very private—with the added bonus of them now having access to my location and phone number. It’s entirely possible—and advisable, if you’re trying to leave a minimal footprint—to live without these apps, but I choose not to.

I’m selective about the apps I install on my phone. Aside from the apps that come with the phone and my social media apps, I only have Uber—and that has a lot of access to my phone. According to the app information, Uber can access my contacts, phone identity, location, maps, microphone, data services, phone dialer, speech and web browser. That’s a lot, and not all of it seems necessary—why does Uber need my contacts? Again, though, I chose to compromise my privacy on this one because the convenience, for me, outweighed the risk.

A precaution I’ve always taken is turning off my location service unless I need it. While my cell phone company can still track me, this prevents my apps from accessing my location. I don’t need Pinterest or Facebook to know where I am to get what I want out of the app, so I don’t provide that information to them.

One of the projects Access Now has been working on is “super cookies”—when you use your cell phone, the cell companies can attach unique identifiers to your browsing as you go across multiple sites. Many companies don’t even offer opt-outs. AT&T has now stopped using super cookies, but other companies still do so.

If you don’t already, use two-step verification whenever possible to ensure that no one but you is logging onto your accounts. This process, used by Gmail, has you enter your password and a one-time numerical code texted to a phone number you provide.

Set a passcode to your phone if you haven’t already, and make it something people couldn’t easily guess—don’t use your birthday, for example. I’ve started using random numbers and passwords generated for long-defunct accounts like my middle school computer login that I memorized years ago but that can’t be linked back to me.

Amie Stepanovich of Access Now suggested using four unrelated words strung together for online account passwords—they’re even harder to hack than the usual suggestions of capital and lowercase letters, symbols and numbers.

One final precaution you can take is to encrypt your device. Apple has already started encrypting its phones by default, and Google has promised to do so. Regardless, you can turn on encryption yourself. I have a Windows phone, which does not allow for easy encryption—in fact, I can’t encrypt my SD card at all. To encrypt my phone, I need to log in to Office 365 on my laptop and change my mobile device mailbox policies to require a password, encryption, and an automatic wipe after a number of passcode fails I choose. I then log into Office 365 on my phone to sync the new settings. It’s much more straightforward for an Android—just go to settings, security, and choose “Encrypt phone.”

Off the grid? Not even close

For me – and most people – it’s not feasible to live entirely off the grid. Between my debit card, various online accounts and smartphone, I pour my personal data into company and government databases every day. The trick is to live on the grid intelligently, providing only the information that is necessary and taking steps to protect your devices from unauthorized access.

]]>
White House pushes for student data regulations http://nationalsecurityzone.medill.northwestern.edu/blog/2015/03/19/white-house-pushes-for-student-data-regulations/ Thu, 19 Mar 2015 21:32:07 +0000 http://nationalsecurityzone.medill.northwestern.edu/site/?p=21196 Continue reading ]]> WASHINGTON — When the educational company ConnectEDU filed for bankruptcy about a year ago, it tried to do what any business would — sell off its most valuable asset: student data.

Millions of students submitted personal information such as email addresses, birth dates and test scores to the college and career planning company.

The Federal Trade Commission eventually stopped any transactions involving the data after noting that such a sale would violate ConnectEDU’s privacy policy.

Some student educational records are protected through the Family Educational Rights and Privacy Act, or FERPA. Originally signed into law in 1974, FERPA essentially protects the records schools collect on students and gives parents certain oversight and disclosure rights.

The growing influence of technology in classrooms and in administrative data collection, though, is making FERPA out-of-date.

Teachers, students and parents now routinely submit information to educational services companies, such as ConnectEDU. FERPA does not regulate how these companies use that data. And there is no other federal law that does. The companies’ own privacy policies are the only limit to what the companies can do with the information users provide.

The concern is that ConnectEDU may not be the only education technology company that is trying to sell its data to third parties.

ConnectEDU’s databases, for example, were filled with students’ personally identifiable information including names, birthdates, email addresses and telephone numbers. The sale of that information to other companies is not regulated.

To bring FERPA up to date, President Barack Obama, in conjunction with partners in the private sector, called in January for legislation to establish a national standard to protect students’ data.

“It’s pretty straightforward,” Obama said in a speech at the Federal Trade Commission. “We’re saying the data collected on students in the classroom can be used for educational purposes — to teach our children, not to market to our children. We want to prevent companies from selling student data to third parties for purposes other than education. We want to prevent any kind of profiling about certain students.”

Dubbed the Student Digital Privacy Act, the White House’s plan is loosely based on a 2014 California law that prohibits third-party education companies from selling student information. While other states have laws increasing the transparency and regulation of student data collection, the California law seems to be the most far-reaching.

Because FERPA doesn’t cover third-party use, some private sector leaders have committed to clear industry standards for protecting student data through the Student Privacy Pledge.

Obama mentioned the pledge, created by the Future of Privacy Forum and the Software and Information Industry Association in the fall of 2014, as an encouraging sign for the protection of student information.

“I want to encourage every company that provides these technologies to our schools to join this effort,” Obama said. “It’s the right thing to do. And if you don’t join this effort, then we intend to make sure that those schools and those parents know you haven’t joined this effort.”

So far, 123 companies have signed the pledge, including tech and education giants such as Apple, Microsoft, Google and Houghton Mifflin Harcourt.

“There was a lack of awareness, information and understanding about what school service providers did and didn’t do with data and what the laws required and allowed,” Mark Schneiderman, senior director of education policy at SIIA, said. “Rather than waiting for public policy and public debate to play itself out, we figured, let’s just step in and make clear that the industry is supporting schools, is using data only for school purposes, not selling the data, not doing other things that there was a perception out there that maybe [companies were doing].”

The National Parent-Teacher Association and other groups support the pledge, according to Schneiderman.

“It is imperative that students’ personal information is protected at all times,” the National PTA wrote in a statement.

The companies that signed the pledge are not subject to any policing body, but by signing the pledge they show consumers their commitment to student privacy, Schneiderman said.

But many notable educational technology companies, like Pearson Education, have not signed the pledge. Pearson was recently the subject of a POLITICO investigative report that revealed that the company’s use of student data was unmonitored.

According to the report, Pearson claims it does not sell the students’ data it collects.

The College Board, ACT and Common Application are often viewed as integral to the college admissions process, but are also not included in the pledge.

Instead, these education companies point consumers to their privacy policies, which can often be difficult to understand because of the legal jargon and ambiguous terms.

Some groups such as the Parent Coalition for Student Privacy think the pledge and the privacy policies aren’t enough.

“We also need strong enforcement and security mechanisms to prevent against breaches,” Leonie Haimson, one of the group’s co-chairs, said in a statement responding to Obama’s speech. “This has been a year of continuous scandalous breaches; we owe it to our children to require security provisions at least as strict as in the case of personal health information.”

Out of the 12 commitments listed in the pledge, only one deals with preventing leaks or breaches.

The signees must “maintain a comprehensive security program that is reasonably designed to protect the security, privacy, confidentiality, and integrity of student personal information against risks,” the pledge states.

Haimson said the policies are a decent start, but do not go nearly far enough in protecting educational data.

Regardless, a bill for a comprehensive national standard has yet to be introduced despite the White House’s push.

In early February, though, the White House said that it had been working closely with Republican Rep. Luke Messer of Indiana and Democratic Rep. Jared Polis of Colorado to introduce a bipartisan bill in Congress.

The bill’s release is expected by the end of the month, according to Messer’s office.

]]>
Rise of the machines: domestic drones take off http://nationalsecurityzone.medill.northwestern.edu/blog/2012/04/03/rise-of-the-machines-domestic-drones-take-off/ http://nationalsecurityzone.medill.northwestern.edu/blog/2012/04/03/rise-of-the-machines-domestic-drones-take-off/#comments Tue, 03 Apr 2012 18:54:41 +0000 http://nationalsecurityzone.medill.northwestern.edu/site/?p=10085 Continue reading ]]>

(Defence Images/Creative Commons)

WASHINGTON – Drones – the same unmanned aircraft used for attacking the Taliban and killing Islamist terrorists – could soon come to a sky near you.

On Feb. 14, President Barack Obama signed the Federal Aviation Administration’s Modernization and Reform Act of 2012, accelerating the timetable for unmanned air vehicle use in U.S. skies. The bill greenlighted both public and private UAVs – or drones – for domestic liftoff by September 2015.

But privacy advocates are hotly protesting the law, warning that the FAA bill is the first step down a dangerous road to a surveillance society. UAVs’ high-tech cameras and sensors, they say, coupled with the current lack of regulation regarding drone use, could lead to a nation in which Big Brother watches from the sky.

The FAA previously blocked domestic UAV use due to safety concerns. But the military’s growing drone fleet – drones now make up a third of military aircraft – has driven improvements in the sense-and-avoid technology that helps prevent mid-air collisions.

Experts agree the bill opens the door for a commercial industry that could bring UAVs to any area from crop dusting to personal photography.

Drones’ main draw, however, is in the public sector – and therein lies their main controversy.

Advocates like Rep. Buck McKeon, R-Calif., say bringing UAVs to U.S. skies will lead to unprecedented gains in border defense, public safety and emergency response.

“Our state and local law enforcement agencies need a faster, more responsive process,” McKeon said in a statement. “Our neighborhoods deserve safer streets, and these systems can help provide that.”

Source: FAA Modernization and Reform Act of 2012 (David Uberti/Medill)

Opponents of the FAA bill don’t dispute drones’ policing capabilities. But they say the same components that allow drones to stalk and strike terrorists in the Middle East and South Asia will be used to scout crime scenes, follow suspects and patrol wide areas. Thermal imaging, for example, makes it easy to look at suspects inside buildings. And high-resolution cameras let operators follow several subjects simultaneously.

The rapidly improving technology is privacy advocates’ main concern. The FAA expects as many as 30,000 UAVs – some as small as birds, others as large as the 116-foot Global Hawk – to fly in U.S. skies within 10 years.

Keeping up with technology 

“The technology is getting cheaper and more powerful and smaller,” said Jay Stanley, a policy analyst for the American Civil Liberties Union. “It’s entirely predictable that the use of this technology will spread greatly unless there are obstacles put in its way.”

Stanley wrote a December ACLU report urging the FAA to expand its regulations to include privacy measures – not just safety guidelines. Although the air agency has repeatedly denied this responsibility, civil liberties groups insist that ensuring personal privacy helps protect individuals on the ground.

If the FAA doesn’t consider privacy safeguards in its UAV regulations, advocates want Congress to fill the gap. The main concerns are overuse by government and law enforcement agencies, including mass surveillance, video retention and see-through imaging, Stanley said.

“It’s important that these protections be put in place in the infancy of this technology so that everybody understands the ground rules of the game,” he said.

But UAV supporters think otherwise. Ben Geilom, government relations manager for the Association for Unmanned Vehicle Systems International, said current regulations for manned aircraft should extend to their unmanned counterparts.

“The aircraft itself…is new and maturing,” he said. “But the systems payload – the cameras and sensors that are on the unmanned system – are not new. In fact, they have been used by law enforcement and others on manned aircraft for decades.”

Small drones are able to hover outside of house windows to capture images and sounds, but that doesn’t mean it’s legal under current air regulations, Geilom said. Most of the privacy fears, he added, are due to unfamiliarity.

“With any new technology, there will certainly be the ability to abuse that technology,” Geilom said. “But there are also safeguards that are already in place that can serve as the framework.”

Complicating things further is that drone technology is progressing at a furious pace. The last time Congress passed a comprehensive FAA bill before February’s legislation was in 2003, when UAVs were in their infancy.

Future regulations should be limited to “broad safety parameters,” Geilom said, as more-detailed guidelines will be hard pressed to keep up with the accelerating technology.

“If unmanned aircraft can prove that it can seamlessly and safely integrate into the current manned aviation airspace…then they certainly should be able to integrate,” he said.

Medium to large-sized drones used by the U.S. military. (Congressional Research Service)

Public up in the air

Despite widespread support from law enforcement agencies and the defense industry, the public remains deeply divided over domestic drone use. A February Rasmussen poll found that only 30 percent of voters approve of UAVs flying in American skies. More than half, meanwhile, oppose it altogether.

Congress has thus far largely ignored privacy concerns, including few privacy regulations in the FAA bill and making no effort yet to add rules elsewhere.

Privacy advocates – notably the ACLU, Consumer Watchdog and the Electronic Privacy Information Center – called for added guidelines in a Feb. 24 petition to the FAA. Regulations must be added to keep up with UAV technology, they wrote, because drone use “poses an ongoing threat to every person residing in the United States.”

But law experts question the likeliness of such safeguards. Ryan Calo, director for Privacy and Robotics at Stanford University’s Center for Internet & Society, said that U.S. privacy law doesn’t hold back drone use.

The Supreme Court ruled in 1986 that no warrant was required for government agencies to take aerial photographs of a person’s backyard. And in 1989, the justices ruled that police do not need warrants to observe private property from public airspace.

“Citizens do not generally enjoy a reasonable expectation of privacy in public, nor even in the portions of their property visible from a public vantage,” Calo wrote in the Stanford Law Review. “Neither the Constitution nor common law appears to prohibit police or the media from routinely operating surveillance drones.”

Geilom and others within the UAV industry insist current rules for manned aircraft will suffice for domestic drones. Over-regulation of a potentially lucrative industry before it gets off the ground could squander opportunities for not only law enforcement, but also photographers, real estate agencies and farmers, they say.

Opponents, however, paint a much darker picture. Only a few hundred of the 19,000 law enforcement agencies in the country have a manned aircraft arm. Stanley said the ACLU fears that without further privacy protections, government organizations could overuse or mishandle such drone technology.

“That would fundamentally change the nature of our public spaces and public life and the nature of the relationship between an individual and government,” he said. “It’s not a road we should go down.”

]]>
http://nationalsecurityzone.medill.northwestern.edu/blog/2012/04/03/rise-of-the-machines-domestic-drones-take-off/feed/ 1
Cash is king: Jim Harper and privacy in the digital age http://nationalsecurityzone.medill.northwestern.edu/blog/2012/03/14/cash-is-king-jim-harper-and-privacy-in-the-digital-age/ Wed, 14 Mar 2012 22:10:13 +0000 http://nationalsecurityzone.medill.northwestern.edu/site/?p=10232 Continue reading ]]>

WASHINGTON — After his first year of law school, Jim Harper was driving across the country with a friend when sirens suddenly started flashing in his rearview mirror.

Harper, who is now the director of information policy studies at the Cato Institute, said in a mid-February interview that what ensued after he pulled over ultimately shaped his libertarian outlook on life.

The police officer, Harper recalled, told him he detected the scent of marijuana and not only brought out a drug-sniffing dog to search the car, but also a television show crew to document the process.

“I thought to myself, ‘If a police officer can invent a smell and take well-educated, well-spoken white guys out of the car and mess with them, imagine what it’s like for people who aren’t as well-educated or from minority communities,'” Harper said. “What do you suppose life is like for them?”

Such questions have formed the ideological basis of Harper’s career, which coincided with the Internet’s rise in the mid-1990s and early 2000s.

He served as a founding member of the Department of Homeland Security’s Data Privacy and Integrity Advisory Committee in April 2005, providing input at the intersection of personal information and national defense. In his free time, he began laying the groundwork for his own consulting firm.

He joined Cato in September 2004, shuttering his private consulting firm, Policy Counsel.

Harper explained he lost his “team mentality” in transitioning from the Department of Homeland Security to the lobbying industry to the libertarian-leaning think tank.

“Once I left the Hill, I really abandoned the idea that one party is better than the other,” he said. “It’s all about policy — I work on what the right policy is.”

For libertarians, the right policy tends toward less government intervention in Americans’ day-to-day lives, Harper explained.

“It’s so important for individuals to protect themselves, rather than rely on the government to protect them,” he said.

Harper downplayed the notion that the country is at risk of another large-scale terrorist attack, saying he is not concerned when someone in a foreign nation declares he or she is targeting the United States; what matters is that person’s ability to actually carry out those intentions.

“Terrorists have very little capability — happily so,” Harper said. “They got lucky on 9/11, but they won’t get lucky again.”

Instead, Harper suggested Americans should be more focused on little-known threats against their civil liberties on the home front. For example, he pointed to Internet monitoring as a real danger most Web users overlook.

He added it’s “very likely” that the National Security Agency, which operates under a confidential federal budget, has peeked at Americans’ Internet browsing on occasion.

“They might have everything that happens online — the surveillance possibilities are enormous,” Harper said. “And obviously the civil liberties consequences are enormous as well.”

His advice for privacy-minded citizens:

— Educate yourself about how both technology and government work.

— Know that every time you swipe your credit card or turn on your cell phone, your personal data is being recorded and stored somewhere.

Those pointers hark back to Harper’s first brush with libertarian principles as sirens flashed in his rearview mirror almost two decades ago.

“We need government for some things, but the power of government can be readily abused,” he said. “If I was a victim of a small abuse, the other people in my society could be living with huge abuses.”

Story by Patrick Svitek
Video by Ed Demaria, Rebecca Nelson and David Uberti

]]>
Google standing by hotly contested change in privacy policy http://nationalsecurityzone.medill.northwestern.edu/blog/2012/03/04/google-standing-by-hotly-contested-change-in-privacy-policy/ Sun, 04 Mar 2012 22:20:25 +0000 http://nationalsecurityzone.medill.northwestern.edu/site/?p=9928 Continue reading ]]> WASHINGTON — Google is maintaining that a privacy policy implemented Thursday is not the dangerous change civil liberties experts are claiming it could become.

The new approach combines the privacy policies of more than 60 Google products into a uniform code that emphasizes what the search giant considers a “more intuitive user experience.”

In an official Google blog post Thursday, Alma Whitten, the company’s director of privacy, product and engineering, wrote that the policy adjustment makes Google’s privacy controls easier to understand. Beyond that, nothing has been drastically modified, she said.

“The new policy doesn’t change any existing privacy settings or how any personal information is shared outside of Google,” Whitten wrote. “We aren’t collecting any new or additional information about users. We won’t be selling your personal data. And we will continue to employ industry-leading security to keep your information safe.”

The company has contended a more universal policy will work to its users’ advantage in the long run. For example, under the new privacy policy, one Google product could generate traffic conditions if another Google product pinpoints the user in a certain geographic location.

Since the altered privacy policy was disclosed earlier this year, it has touched off a wave of international criticism from everyone from civil liberties watchdogs to elected officials.

In late February, 36 attorneys general signed an open letter dinging Google for not allowing users to opt out of the new privacy policy. The message, addressed to Google CEO Larry Page, added that the privacy shift allows a user’s personal information to be shared across multiple services even if the user signs up for only one service.

The privacy policy revamping basically results in personal data being “held hostage in the Google ecosystem,” the members of the National Association of Attorneys General said in the letter.

The association’s missive came several days after the Electronic Privacy Information Center sued the Federal Trade Commission as a way of persuading it to curb Google’s impending policy change.

And on Thursday, European Union Justice Commissioner Viviane Reding declared the consolidated privacy policy goes against European law. She told the BBC that the search giant is not following transparency rules as it collects personal information across Google’s dozens of platforms, including YouTube and Blogger.

Google has greeted each challenge with the same defense: Its new, unified privacy policy follows all applicable laws and makes using its services easier for all users.

The company told a reporter for The Washington Post’s Post Tech blog that it remains “happy to discuss this approach with regulators globally.”

Thursday’s Google blog post confirmed the company’s confidence in its privacy policy revision.

“As you use our products one thing will be clear: It’s the same Google experience that you’re used to, with the same controls,” Whitten wrote.


]]>