Lecture Notes: Computers, Crime and
Privacy 1996
[While I do not intend to follow the same
structure in 1997, I will be covering much of the same material]
I. Class Mechanics: Seminar
- A. Everyone does a paper, due last day of class.
You are urged to submit early drafts for comments.
- B. You have the option of doing a class
presentation of the material of your paper, taking up to a full
class period (if not too many people want to do it). The
presentation may be by two people with related papers, but each
person must produce his own paper.
- C. Materials: Two books (one available in machine
readable form), cases, other readings. Students may suggest
readings, and add material and links to the web page.
- D. I am reachable by EMail, preferably at
ddfr@best.com.
- E. There is a web page.
- 1. The page is being maintained by Angela Belluomine:
Briggsp@hooked.net.
- 2. Its address is:
http://www.scu.edu/SCU/Programs/HighTechLaw/courses/ccp/ccp1.html
- 3. If you wish to add material or links, or help with
maintaining it, talk with her.
- F. Some material is also available on disk on
reserve at the library.
- G. This is a new field; I hope to learn from your
research as well as your learning from mine.
II. Three topics:
- A. Computer crime:
- 1. What it is
- 2. How it fits into the current structure of law
- a. Is information property? When someone transported
records of pirated outtakes from Elvis Presley recording
sessions, was he only violating copyright law, or was he
also transporting stolen property? His reply was that the
records belonged to him--he had paid for their production.
The court agreed--that time. Dowling v. United
States.
- b. How is it to be valued? In the Bell South case (US
v Neidorf), the question was whether a copy of a text
file on the workings of the 911 system was "stolen property
worth more than $5000." Bell South claimed the value of what
had been stolen was the cost of creating the original
file--which they still had. In Lund v Virginia, a
graduate student had "stolen" computer services from his
university; the court found that what was stolen was only
the physical output (punch cards, paper printout), to be
valued as scrap.
- c. The question of whether something should be a crime
is not the same as the question of whether something should
be done about it; alternative approaches include Civil law,
Trade Secret, Copyright. ...
- 3. How should computer crime (and other legal issues
involving the new technology) be dealt with? At least three
alternatives:
- a. By analogy. Stealing information is "like" stealing
property, for example.
- b. By new legislation.
- c. By realspace contract, property rules applied to the
computers and communication lines that support cyberspace,
etc. In other words, apply existing laws, not to cyberspace
by analogy but directly to the physical support of
cyberspace.
- B. Old privacy issues
- 1. Protection against false information being distributed.
(defamation law?)
- 2. Protection against true information being distributed.
(Privacy) Fair Credit Reporting Act. Is such protection
desirable? Is providing it practical?
- 3. Protection against government collection and
distribution of information.
- C. New privacy issue: Privacy via
encryption.
- 1. Computers as promoters of privacy
- 2. The downside of this includes privacy for activities
which not only are but ought to be criminal--murder for hire,
for example.
- 3. What other social implications does such privacy have?
III. Privacy and encryption: The need
- A. With current communication technology, most
notably EMail and cellular phones, it is physically difficult for
the user to prevent others from intercepting his
communication.
- B. Even if interception is made illegal, the law
is difficult to enforce.
- C. The solution is to use encryption to make
intercepted messages unreadable.
IV. Encryption: The technology (Detailed
information is in the FAQ from rsa.com. Some excerpts are on p. 33 of
BBB. The following is my simplified version.)
- A. Public Key encryption for privacy.
- 1. The user generates two keys: call them A and B. A
message encrypted using key A cannot be decrypted using key A;
it requires key B to decrypt. Similarly, a message encrypted
with B requires A to decrypt. A cannot be calculated from B (or
B from A) in any reasonable length of time.
- 2. So you generate two keys, make one public, keep the
other secret. Someone who wants to send you a message encrypts
it with your public key; only you can read it (decrypting with
your private key).
- B. Digital Signature. To prove you wrote
something, encrypt it with your private key. The fact that your
public key decrypts it (and generates a message, not garbage)
shows that it was encrypted with your private key, which only you
have.
- C. Digital Cash
- 1. Desideratum: Be able to make payments that neither your
bank nor outside snoops can trace.
- 2. My scheme (a code sketch follows below):
- a. Send the bank a dollar and a long number. The bank agrees
to pay the dollar to whoever comes in with the number, and
to change the number in response to a request accompanied
with the current number.
- b. When you want to buy something from someone with a
dollar, you give him the number. He sends it to the bank,
along with a new number it is to be changed to (which only
he will know, making the dollar effectively his) and a
second number to identify the transaction.
- c. The bank posts, in some public place, the fact that
the transaction associated with the second number has gone
through. The recipient now knows that he has the dollar
(i.e. the number you gave him really represented an unspent
dollar), so he gives you the goods.
- d. All this takes a second or two; the computers do the
work.
- 3. For Chaum's more elaborate version, see www.digicash.com
on the web.
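A minimal sketch in Python of the number-based scheme above. The Bank class, the method names, and the sample numbers are all my own inventions, meant only to show the bookkeeping; real digital cash (Chaum's included) uses blind signatures rather than a trusted ledger.

    # Toy model of the number-based digital cash scheme sketched above.
    # The bank's ledger holds the secret numbers backing unspent dollars.

    class Bank:
        def __init__(self):
            self.ledger = set()      # secret numbers, one per unspent dollar
            self.bulletin = set()    # publicly posted transaction ids

        def deposit(self, number):
            """Step (a): a customer sends in a dollar and a long number."""
            self.ledger.add(number)

        def transfer(self, old_number, new_number, transaction_id):
            """Steps (b)-(c): change the number, post the transaction id."""
            if old_number not in self.ledger:
                return False         # already spent, or never deposited
            self.ledger.remove(old_number)
            self.ledger.add(new_number)
            self.bulletin.add(transaction_id)
            return True

    bank = Bank()
    bank.deposit(837264901)                  # the buyer's dollar
    bank.transfer(837264901, 552671138, 42)  # seller swaps in his own number
    assert 42 in bank.bulletin               # seller sees the posting, ships the goods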
- D. Untraceable communications: You don't want
people to know whom you are communicating with. Use an anonymous
remailer.
- 1. Send the remailer your message (encrypted with the
recipient's public key)+the recipient's address, the whole
thing encrypted with the remailer's public key.
- 2. Remailer decrypts the outer layer of encryption with his
private key, revealing the address and the (still encrypted)
message. He sends it on.
- 3. If you are afraid the anonymous remailer might be a
snoop, relay your message through ten different remailers en
route to its destination.
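A sketch of the layering. The encrypt and decrypt functions here are just labeled wrappers standing in for real public key encryption (PGP, say), and the remailer names are invented; the point is only that each hop can peel one layer and learns nothing but the next address.

    # Toy model of remailer chaining.

    def encrypt(recipient, payload):
        return {"for": recipient, "payload": payload}

    def decrypt(me, sealed):
        assert sealed["for"] == me     # stands in for needing the private key
        return sealed["payload"]

    # The sender wraps the message for the recipient, then once per
    # remailer in reverse order, so the first hop peels the outermost layer.
    packet = encrypt("bob", "meet at noon")
    route = ["remailer1", "remailer2", "remailer3"]
    for hop in reversed(route):
        packet = encrypt(hop, packet)

    for hop in route:                  # each remailer peels its own layer
        packet = decrypt(hop, packet)

    print(decrypt("bob", packet))      # only the final recipient reads it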
- E. Authentication problems: How do you know my
public key is really my public key, and not a public key someone
has fooled you into thinking is mine, in order to intercept
communications to me and forge communications from me?
- 1. Centralized solution--they are all in the phone book,
each of us checks his own and spreads the word if it is bogus.
- 2. Decentralized solution, a la Phil Zimmermann (PGP 2). Your
computer keeps track of who you got what public key from, and
tells you how sure it is that the key is genuine according to
the number and reliability of the sources.
V. The Supporting Technology: power, bandwidth,
Virtual Reality
- A. With high speed computers and networks capable
of carrying a lot of stuff (i.e. the near future) ...
- B. A large part of all human interaction can be
done over the net via virtual reality.
- C. Now encryption issues become central privacy
issues.
VI. Implications of strong privacy
- A. Transactions, conversations, etc. are
technologically private. Interception is impractical, whether or
not illegal.
- B. A firm can combine anonymity with brand name
reputation! The firm's identity is defined by its public key:
anyone who can read messages encrypted with that key and send
messages decryptable with that key is the firm.
- C. Obvious advantage: freedom of speech,
assembly, etc. are technologically secure.
- D. Advantage/disadvantage: Information sales and
purchases are unobservable, thus untaxable.
- E. Disadvantage: Criminal firms with brand
name--murder inc. Pirate publishers.
- F. Defense against the undesirable consequences
can be by prohibition (of consequences or of strong privacy and
its supporting technologies) or by using the technology for
defense. Examples of the latter approach include:
- 1. Contract substitutes for intellectual property? Maybe.
- 2. True names: If they don't know who you really are, they
can't hire a contract killer to kill you.
- G. Private Law.
- 1. Robert Nozick, in part 3 of Anarchy, State and
Utopia, described an ideal society made up of many
communities with varying rules, free movement among them.
- 2. Consider an EMail mailing list. It is a proprietary
community, controlled by whoever controls the list of names. He
can make and enforce rules of conduct--with the sole sanction
of expulsion.
- 3. Imagine the same thing in virtual reality. You now have
a real proprietary community with private law. Lots of these
give you a competitive market in which anyone can set up his
own. Crypto-Anarchy.
8/23/95
I. Can strong privacy be stopped, and if so
how?
- A. Prohibit encryption?
- 1. Hard to enforce (you can hide messages in anything with
a lot of data--pictures, for example, as the least significant
bit of one of the colors; a sketch follows below)
- 2. Very costly in lost opportunities. Without encryption on
line banking, firms exchanging confidential information
electronically between physically separated offices, etc.
become hard and risky.
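To make point 1 concrete, here is a sketch of the least-significant-bit trick, treating an image as a bare list of 0-255 color values; a real image would need a graphics library, and the function names are my own.

    # Hide message bits in the low bit of each color value. Changing
    # the low bit leaves the picture looking essentially unchanged.

    def hide(pixels, message_bits):
        out = list(pixels)
        for i, bit in enumerate(message_bits):
            out[i] = (out[i] & ~1) | bit   # clear the low bit, store a message bit
        return out

    def reveal(pixels, n_bits):
        return [p & 1 for p in pixels[:n_bits]]

    pixels = [200, 113, 54, 87, 190, 33, 240, 9]
    secret = [1, 0, 1, 1, 0, 1, 0, 0]
    assert reveal(hide(pixels, secret), 8) == secret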
- B. You can slow the development of encryption by
making it less profitable.
- 1. That seems the obvious explanation of U.S. export
controls over encryption.
- 2. You cannot keep foreign powers from getting programs
that are for sale in the U.S., so the controls cannot serve
their putative purpose of keeping "arms" out of the hands of
potential enemies, but ...
- 3. If firms that include encryption in their software must
produce and maintain a separate version abroad, that is a cost
which makes including encryption less attractive, thus slows
the spread of encryption to the general public.
- C. Nationalize Encryption: The Clipper chip
initiative.
- 1. The idea.
- a. The government designs and has produced (by a private
contractor) a chip for encrypting voice communications,
intended to be included in encryption products.
- b. The chip must be provided with a key for encrypting a
communication session; the key exchange protocol is left to
whoever is building the product.
- c. Each chip has its own built-in key, which it uses to
encrypt the session key. That plus the chip's serial number
is encrypted using the law enforcement key (which all cops
have, and soon everyone else--too many people to keep a
secret). That chunk of information (the "law enforcement
block") is then sent along with the encrypted ocnversation.
- d. Police tap a phone, record the communication. They
decrypt the law enforcement block, get the serial number of
the chip.
- e. The chip's key, in two pieces, is held by two escrow
agencies, probably government departments, although one
might be private. The police take the serial number and a
court order to the escrow agencies, get the key.
- f. Using the key, they decrypt the encrypted session
key, use that to decrypt the conversation. Since they now
have the chip key, they can decrypt the session key of any
future conversations as well and listen to them.
- 2. Problems:
- a. Can it work? How do you prevent people from
pre-encrypting their conversation, feeding that into the
Clipper chip, so that when the police decrypt they get
gibberish?
- b. Is it secure? Why is the algorithm kept secret?
- c. Do we trust the government? Forever?
- d. Do other governments trust ours?
- e. What is the point of the clipper chip if it is not
compulsory? Why will the terrorists/drug lords/child
pornographers etc. choose to use a form of encryption
designed to be read by a police officer with a court order?
- f. One answer is that the Clipper chip is supposed to
become a standard, thus the only convenient form of
encryption--what good does it do to use your own encryption
if nobody else does? That suggests that it is aimed against
the unsophisticated criminals. After all, people get caught
by wire taps now, even though they know that wire taps
happen.
- g. A different answer is that the government has
anticipated strong privacy and is trying to prevent it. If
Clipper encryption is the only encryption available to the
general public, then firms selling pirated software,
assassination, etc. will have no safe way of communicating
with the mass market.
- D. What about the alternative of private forms of
encryption, with required escrow of keys?
8/28/95
I. About encryption:
- A. It involves converting plaintext (the message)
to ciphertext (encrypting the message), sending the ciphertext,
then having the recipient convert it back to plaintext
(decrypt).
- B. The algorithm is the process for
encrypting.
- 1. Typically, algorithm is public (lots of people using
it).
- 2. Also makes it possible to test algorithm in order to see
if you can crack it
- 3. And analyze it to figure out whether other people will
be able to crack it.
- C. The key is known only to the people encrypting
and decrypting that particular message.
- D. One simple approach is substituting one letter
for another. A different approach is scrambling the order of the
letters.
- E. Given enough time and skill, and a large
enough message, such a scheme can be decrypted without the key
(the cipher can be broken).
- F. With the exception of a one time pad.
- 1. That is a simple substitution cipher in which the key is
as long as the message and the substitution rule changes at
each letter. For instance:
- a. The key starts 6 9 24 17 ...
- b. The first letter of the message is "H." Substitute
"N" because it is 6 letters later. The next letter is "E."
Substitute "N" because it is 9 letters later. ....
- c. Since each rule ("six letters later," "nine letters
later") is used for only one letter, there is no way of
figuring out the rule from one part of the message and using
it to decrypt another part.
- 2. With modern technology, one could create a one time pad
in the form of a CDROM, make two copies, give one to your
correspondent, and use it for the rest of your lives (assuming
only text messages, adding up to a total of no more than 800
Megabytes or so).
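A sketch of such a pad in Python, over the 26-letter alphabet; it reproduces the H-to-N, E-to-N example above.

    import random

    def make_pad(length):
        """One fresh random shift per letter; never reuse a pad."""
        return [random.randrange(26) for _ in range(length)]

    def shift(ch, k):
        return chr((ord(ch) - ord('A') + k) % 26 + ord('A'))

    def encrypt(plaintext, pad):
        return "".join(shift(c, k) for c, k in zip(plaintext, pad))

    def decrypt(ciphertext, pad):
        return "".join(shift(c, -k) for c, k in zip(ciphertext, pad))

    pad = [6, 9, 24, 17, 3]
    ct = encrypt("HELLO", pad)
    assert ct.startswith("NN")        # H+6 -> N, E+9 -> N, as above
    assert decrypt(ct, pad) == "HELLO"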
II. Key distribution and management
problems.
- A. How do you get the key to the person who will
receive your message, while making sure that nobody else sees
it?
- B. How are keys stored and who has access to
which key?
III. Public key as a solution to both
problems.
- A. Distribution: don't need secrecy, since the
public key can be distributed to everyone, and the private key
need not be distributed to anyone other than its original
owner.
- B. Non-disclaimable
- 1. Your signature proves that the message was sent by
someone with access to your private key.
- 2. You can claim that your private key got out somehow
- 3. But that means conceding your fault, which makes it more
practical for you to be liable (for the money that the message
took out of your bank account to stay out, for example).
- C. The RSA faq excerpt explains the mathematics
of its version of public key encryption in half a page--for the
mathematically sophisticated.
- 1. The one-way nature of the encryption (a key
cannot decrypt the message it encrypted) comes from the fact that
it is easy to calculate the number n=pq, where p and q are very
large primes, but is hard to find p and q starting with n (i.e. to
factor n).
- 2. So RSA encryption could be broken if someone
found a fast way of finding a number's prime factors. Nobody has
yet, but neither has anyone proven that no such way exists. (A toy
numeric example follows below.)
- 3. Suggestive anecdote: Fermat, a famous French
mathematician, once received a letter from a correspondent asking
if a certain large number was prime. Fermat wrote back that it was
not, giving its prime factors. We do not know how he could have
done so so fast (computers were very far in the future). So maybe
he knew something we don't. (All this is from memory, and I do not
vouch for its accuracy).
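A toy numeric version of the RSA scheme just described, using the standard textbook primes 61 and 53 (real keys use primes hundreds of digits long). Breaking it is exactly the problem of factoring n--trivial here, infeasible at real sizes.

    # Toy RSA: shows the structure only.

    p, q = 61, 53
    n = p * q                   # 3233; published, and (at real sizes) hard to factor
    phi = (p - 1) * (q - 1)     # 3120; computable only if you know p and q
    e = 17                      # public exponent
    d = pow(e, -1, phi)         # private exponent (modular inverse; Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)      # encrypt with the public key (e, n)
    assert pow(ciphertext, d, n) == 65   # decrypt with the private key (d, n)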
- D. Speed issue: Public key encryption is much
slower than some forms of private key encryption. So what you
really do is to generate a session key for your message, send that
encrypted with the recipient's public key, and send the message
encrypted with the session key using a fast private key
algorithm.
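A sketch of that hybrid arrangement, reusing the toy RSA pair above; the XOR stream is a stand-in for a fast symmetric cipher, not a secure one, and a real system would encrypt the session key as a single padded block rather than byte by byte.

    import secrets

    def xor_stream(data, key):
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    n, e, d = 3233, 17, 2753            # toy RSA key pair from the sketch above

    session_key = secrets.token_bytes(8)
    body = xor_stream(b"a long confidential message...", session_key)
    wrapped = [pow(b, e, n) for b in session_key]   # only the short key gets RSA

    # Recipient: unwrap the session key with the private key, then decrypt.
    recovered = bytes(pow(c, d, n) for c in wrapped)
    assert xor_stream(body, recovered) == b"a long confidential message..."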
IV. A Digital Signature serves three
functions--identify sender, prove sender, untampered text.
- A. One simple way is to encrypt the message using
the sender's private key, then encrypt that plus the sender's name
with the recipient's public key.
- B. One way of proving that a message has not been
tampered with is to use a hash function--a procedure for
calculating a number from the message. If the number (sent
encrypted) does not correspond to the number calculated from the
message you got (not necessarily encrypted), you know the message
has been changed, deliberately or otherwise.
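A sketch of the tamper check, using SHA-256 (a modern hash picked for illustration; the notes do not specify one):

    import hashlib

    message = b"pay to the bearer one dollar"
    digest = hashlib.sha256(message).hexdigest()  # sent encrypted or signed

    # The recipient recomputes the digest from what actually arrived.
    tampered = b"pay to the bearer one million"
    assert hashlib.sha256(message).hexdigest() == digest
    assert hashlib.sha256(tampered).hexdigest() != digest  # any change shows up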
- C. Note the relevance of the ability to "prove"
who sent a message to legal issues involving
obscenity, indecency, and defamation.
V. As computers get faster, they can encrypt
faster, decrypt faster, and break encryption faster.
- A. So you use longer and longer keys--which take
longer to encrypt and decrypt but are harder to break.
- B. It appears that, as the key gets longer, the time
it takes to break the encryption increases faster than the time to
encrypt or decrypt. If so, encryption can get stronger as computers
get faster.
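Back-of-envelope arithmetic behind that claim: each added bit doubles the brute-force search, while legitimate encryption slows only modestly. The attacker's guess rate below is an arbitrary assumption.

    GUESSES_PER_SECOND = 1e9    # assumed attacker speed

    for bits in (40, 56, 64, 128):
        years = 2**bits / GUESSES_PER_SECOND / (3600 * 24 * 365)
        print(f"{bits}-bit key: roughly {years:.2g} years to search exhaustively")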
8/30/95
I. Non-cryptographic attacks: Consider a simple
password cracking problem. You are a hacker who had dialed into a
computer and is trying to get privileges on it--which requires giving
it a password it recognizes as associated with a legitimate
user.
- A. Brute force approach--try all
passwords.
- B. If passwords are invented by their users, try ...
(a sketch of the resulting dictionary attack follows this list)
- 1. User's name. Common names.
- 2. Example from the instructions provided to the users
(i.e. "password")
- 3. This does not work in most encryption contexts because
the key is generated by a computer, and is a big number, not a
word.
- 4. DEC used to send their computers out with three sample
accounts, each with a standard password. The buyer was supposed
to disable those when he set up his own accounts. Some did
not--so a hacker could get on via a sample account--preferably
one designed for a system administrator, with privileges to
match.
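A sketch of that guessing strategy as a dictionary attack against a hypothetical stolen table of hashed passwords; every name and password here is invented.

    import hashlib

    def h(pw):
        return hashlib.sha256(pw.encode()).hexdigest()

    stolen_table = {"alice": h("password"), "bob": h("x7#qTk2$Lp")}
    guesses = ["123456", "password", "alice", "letmein"]  # names, manual examples

    for user, stored in stolen_table.items():
        for g in guesses:
            if h(g) == stored:
                print(user, "uses", repr(g))   # alice falls; bob survives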
- C. Find the password by
- 1. Watching the user type it (directly or via computer).
- 2. Searching the trash can
- 3. Searching a user's wallet, apartment, etc.
- D. Piggyback.
- 1. Physical version, where you need a key, card, etc. to
enter. Wait by the door with your arms full of boxes of punch
cards, come in after a legitimate user.
- 2. If a user does not log out correctly, you just "continue"
his session.
- 3. You modify things to make sure the user does not log out
correctly--but thinks he does.
- E. Social Engineering: Get a password by asking
for it.
- 1. "I am calling from outside and forgot ..."
- 2. I am from the computer company, testing the machine, and
I need your password.
- 3. Pretend to be a superior official in the company.
- 4. Bribe someone who knows, such as the secretary who
actually does the computer stuff for her boss.
- F. Think about equivalent approaches for breaking
encryption.
II. Why does the government care about
cryptography?
- A. NSA.
- 1. Foreign governments might learn better codes, which would
be harder to crack.
- 2. They might realize that we can crack their codes--as the
German and Japanese governments did not realize in WWII.
- 3. Might learn to crack our codes.
- 4. But the Russians are good mathematicians... . Letting them
know what we know as well as what they know might make things a
little easier for them, but it seems more likely that secrecy is
intended to keep information from smaller countries and private
citizens.
- B. FBI: Interested in being able to tap phones.
It tried to get such a bill in 1992, succeeded in 1994.
- 1. Digital Telephony may be mechanically harder to tap.
- 2. Cellular phone calls, especially on a system designed to
provide some privacy, may be harder to identify--you can
intercept a message, but how do you find the conversation
corresponding to a particular phone?
- 3. Encrypted phones are useful when the conversation is
over the air, but make wiretaps harder
- 4. All three of these can be dealt with by the
provider--but what if the user provides his own encryption?
- 5. Old Phone Phreak version of the problem. People who knew
the phone system well could route their calls by indirect
routes, making it very hard to trace them.
- 6. How much is the ability to tap phones worth?
- C. Have NSA, FBI etc. thought about strong
privacy issues?
III. The question of standards:
- A. Arises twice in this discussion
- 1. One reason the government chose DSS rather than RSA was
apparently that RSA can also be used for encryption. By
encouraging its use for one purpose, you make it easier for it
to become a standard and be used for the other purpose. If I
already have your public key in order to verify your messages,
why not encrypt messages to you using it?
- 2. The government may plan to "enforce" Clipper by making
it a standard, rather than by making it mandatory. Since you
have to use it for secure conversations with the government,
you might as well use it for other conversations as well.
- B. What is a standard?
- C. How stable is a suboptimal standard?
- 1. Examples:
- a. English spelling seems to be very stable.
- b. It used to be claimed that the Wankel ("rotary")
engine was much better than the reciprocating engine (what
we use in all cars except the Mazda RX7), but was not used
because everyone had sunk money into optimizing the
reciprocating engine. The evidence against this is that NSU and Mazda
both put a lot of effort into producing a commercially
viable rotary engine, without much success.
- c. It is claimed that the Qwerty keyboard is much
inferior to the Dvorak keyboard, but remains dominant
because it became a standard, more or less by accident, in
the 19th century. For the evidence that that story is almost
entirely mythical, see the article "The Fable of the Keys"
in Journal of Law and Economics some years ago.
- d. b and c suggest the difficulty of judging such
claims--given that the "superior" alternative is not in use
to be evaluated.
- 2. How might an inefficient standard get established?
- 3. How high are the costs of switching to the superior
standard?
- a. In the QWERTY/Dvorak case, the costs to some users
(typists using many machines) were high, but to others (authors
using their own machines, a firm that trained and retained its
own typists) would be low. The fact that those in the latter
class did not switch over is evidence against the claim that
Dvorak was much superior.
- D. Standards also appear as an issue in
discussions of intellectual property in computer law:
- 1. The claim that something is a standard is usually
assumed to be an argument against protecting it, since we want
compatibility, but ...
- 2. One might get compatibility by having a standard owned
and licensed out.
- 3. The fact that something is a standard might be an
argument for protection; if we want good standards, we should
give their inventors protection so they can make money by
inventing them. One can argue that the superiority of the Mac
interface is the result of the fact that Apple, through its
copyrights on the Mac ROMs, had de facto ownership of the
interface, thus an incentive to design it well and keep
developing it.
I. Review:
- A. Another suggested paper topic. What are the
legal (and reputational) risks to a university from giving its
students access to the internet, and how can they be controlled--with
special attention to current policies at this university.
- B. Standards:
- 1. Example of electric plug and socket. We keep using the
first design to come into common use, not because it is better
than alternatives, but because life is easier if all plugs fit
all sockets.
- 2. This suggests one model of a standard: Easy to devise,
all alternatives about equally good, the first one that appears
survives. That sounds like a situation where intellectual
property protection is neither necessary nor desirable, since
it simply raises the costs of standardization (new producers
have to negotiate a licensing agreement with the owner of the
standard).
- 3. Consider an alternative model: A complicated standard,
where getting it right in the beginning and then modifying it to
allow for new developments is important. That provides an
argument for protection, to give someone an incentive to come
up with a good standard and maintain it.
- 4. A common alternative in such a situation is a standards
committee.
- C. Net for Novices
- 1. What is it? A standard/protocol, not an organization.
There are committees to set standards, but they have no legal
authority.
- 2. EMail: How it works. Your computer sends the message,
possibly in several packets, to another computer, which sends
it ... until it reaches the destination.
- 3. FTP: A system for making files on one machine available
to anyone else on the internet (anonymous ftp) or anyone else
with the proper password.
- 4. Usenet News: Thousands of bulletin boards, each on a
different topic, with no central site maintaining them. When
you post, your machine has the post immediately and passes it
on to other machines it communicates with.
- a. Some newsreaders provide filtering, making it
possible for you to tell your machine that you do not want
to read any more posts on a particular topic or by a
particular poster.
- b. The structure of News raises problems for enforcing
rules against obscenity, indecency, and defamation. Except
for moderated groups, there is no filtering mechanism for
what gets posted. And punishing after the fact is made
difficult by the fact that a sophisticated user can make a
post appear to originate somewhere other than his actual
site.
- 5. Gopher: Lots of publicly accessible servers with
information about what is where on the net.
- 6. The Web: A geographically distributed hypertext
document. Anyone can put up a web page and link to any other
page on the net.
II. On to Clipper
- A. The problem from the government's standpoint:
widespread availability of encryption.
- 1. We want it, so individuals and firms can protect
themselves, but ...
- 2. Law enforcement prefers to be able to read our mail.
- B. Solution: provide it, but make it vulnerable
to anyone with a court order.
- C. Mechanics.
- 1. Each chip has a serial number and a key (programmed in
secure circumstances, escrowed in two pieces).
- 2. Two phones negotiate a session key.
- 3. Each chip transmits the conversation encrypted with the
session key, plus a law enforcement block consisting of the
serial number encrypted with the Law Enforcement Key plus the
session key encrypted with the chip key.
- 4. Law enforcement taps (with court order), decrypts the
law enforcement block, gets serial number, goes to escrow
agent, gets chip key, decrypts session key, decrypts
conversation.
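A toy model of those four steps, with XOR standing in for the real (classified) cipher and every key invented; it shows only the structure of the law enforcement block.

    def xor_bytes(data, key):
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    family_key  = b"FAMILY--"    # built into every chip and decrypt device
    chip_key    = b"CHIPKEY!"    # escrowed in two pieces
    serial      = b"SN:00042"
    session_key = b"SESSIONK"    # negotiated per conversation

    # The chip builds the block: serial number plus the session key
    # encrypted under the chip key, all encrypted under the family key.
    leaf = xor_bytes(serial + xor_bytes(session_key, chip_key), family_key)

    # Police side: the family key opens the block, yielding the serial
    # number; the chip key then comes from the escrow agents, by court order.
    inner = xor_bytes(leaf, family_key)
    assert inner[:8] == serial
    assert xor_bytes(inner[8:], chip_key) == session_key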
- D. Why escrow system might not guarantee
access:
- 1. Non-clipper chips could be used instead of the Clipper,
or
- 2. They could be used to pre-encrypt the message, then feed
it into the Clipper, or ...
- 3. A chip or software could be used to remove, or better to
alter, the law enforcement block as it is being sent. It is
possible to check that a "possibly correct" law enforcement
block is being sent without having the chip's key, so you would
have to produce a block that would pass that test.
- 4. A bootleg clipper chip or the equivalent in software
could be used to send what would look like a clipper encrypted
conversation, but with a bogus serial number. This requires
either corruption of the single producer of the chip or someone
figuring out the algorithm.
- 5. One might make all of these illegal, but that raises
enforcement problems.
- E. Why it might not be private enough.
- 1. The algorithm is secret, and may have a back
door--either known to the NSA or yet to be discovered.
- 2. Can we trust escrow agencies, courts?
- 3. Once a chip is blown, it is blown forever. After the end
of the wire tap the subject is supposed to be notified. But a
law enforcement agent might be able to shift that chip to
another phone for an illicit tap.
- 4. What about bogus chips with known keys, produced
(illicitly) by the Mykotronx midnight shift?
- F. Alternatives:
- 1. Private escrowed encryption; you can sell an encryption
product, provided you guarantee that the keys are escrowed
somewhere with an agency that will accept a court order.
- 2. Perhaps a court could compel production of the key. That
might work for encrypted data on my hard disk, since my claim
to have lost the key would not be believable. But for encrypted
conversations, both parties simply throw away the key at the
end of the conversation.
- 3. Would such a requirement raise self-incrimination
issues?
- G. What was the government trying to do?
- 1. Perhaps they realized that sufficiently sophisticated
criminals/spies, etc. could always get around the Clipper, but
they were aiming it at the mass of unsophisticated criminals,
or ...
- 2. Perhaps they anticipated the strong privacy problems we
have discussed, and were aiming Clipper at ordinary citizens
who might wish to buy illegal goods or services (assassination,
pirated software) from criminal firms.
- 3. Perhaps they intended to get Clipper established as a
standard during the window of opportunity before adequate
private alternatives were available.
- 4. If so, they have lost to PGP for text and may be about
to lose to Nautilus and PGPhone for voice.
- a. Nautilus has been released; it is a public domain
encryption program using a DOS computer and a modem. Like
Clipper it does not provide key exchange; users must somehow
exchange the key in advance. It is half duplex--only one
person can talk at a time.
- b. But given the current rate of progress, it may well
be an adequate Clipper alternative within a year.
- H. Current status of the proposal(s)
- 1. Clipper is being offered only for encrypting phone
conversations--the equivalent for other uses is still under
discussion. It is not, so far, mandatory.
- 2. Export controls are supposed to be eased to permit the
export of strong encryption (up to 64 bit keys), but only if
some sort of escrow system is provided.
- 3. As evidence of the practical difficulties of export
controls: Just after Nautilus was released (and made
available on machines that cannot be accessed from abroad),
someone from Australia posted on a newsgroup, asking for
someone in the U.S. to get Nautilus and send it, via anonymous
remailer, to his favorite server in Australia. In his next post
he apologized for having made the request before checking the
server--Nautilus was already there.
9/11/95
I. Odds and Ends
- A. Santa Clara access policy.
- 1. What it apparently is
- a. Computers to which students have open access are
restricted to the SCU local net, so ...
- b. Students can use them to access the Internet only
through their accounts on the Vax.
- c. This will not apply to computers to which only one
student has access (i.e. when the dorms are wired)
- d. And does not apply now to parts of the University,
such as the Law School, which have arranged to be exempted
from the policy (i.e. the Law School library computers, which
can use Netscape to access the net directly).
- e. The justification for the policy is that it allows us
to know who is responsible if a student does something on
the net.
- 2. How useful as a means of control is it to know whose
account something was done from?
- a. Not much use if users are not aware that they are
doing bad things, thus not deterred.
- b. If they are aware, the obvious solution is to use
someone else's account.
- c. Either watching someone type in his password, or
finding it written ...
- d. In which case the University could argue that the
owner of the account is to blame for being careless with his
password, even though he did not do whatever bad thing was
done from his account...
- e. But there are ways of getting access to someone's
account that are not his fault, such as ...
- f. Installing Last Resort or some other program that
logs keystrokes to disk on a publicly accessible computer,
coming back a day later, copying the keystroke log to your
floppy, and harvesting passwords, or ...
- g. Calling in to your account and being erroneously
connected to someone else's, as has happened (information
contributed by students during the discussion).
- B. White paper proposal:
- 1. Make illegal software or hardware that is primarily
intended to break encryption in order to violate copyright
- a. This amounts to reversing the result in Vault v
Quaid
- b. And is a weaker standard than currently required to
prove contributory infringement in other contexts (i.e.
Betamax).
- 2. It is proposed as a way of protecting intellectual
property.
- a. You can sell a disk of encrypted fonts, and let the
buyer pay to have you decrypt the ones he wants.
- b. And even if an expert could break your encryption,
very few of your customers can,
- c. Since they have to do it themselves, instead of using
a program or machine or service provided by an expert, that
now being illegal.
- 3. It is also a way of protecting weak encryption
- a. A snoop who is an expert cryptographer could break
your encryption, but ...
- b. Unless the information is very valuable, the snoops
won't be expert cryptographers
- c. And software that substitutes for being an expert may
be hard to find
- d. Since an important reason for writing such software
is to crack encryption schemes used to protect copyrighted
material.
- 4. Which might keep people from bothering with strong
encryption--and if everyone uses only weak encryption, it is a
lot easier for NSA to read everyone's mail.
- 5. There remains the problem of identifying things mainly
intended to ...
- 6. And the problem of preventing illegal public domain
software tools from spreading across the net. PGP spread
despite the fact that, at least initially, it seems to have
been in violation of RSA patents.
The proposed section 1201 would
provide:
No person shall import, manufacture or distribute any
device, product, or component incorporated into a device or product,
or offer or perform any service, the primary purpose or effect of
which is to avoid, bypass, remove, deactivate, or otherwise
circumvent, without the authority of the copyright owner or the law,
any process, treatment, mechanism or system which prevents or
inhibits the violation of any of the exclusive rights of the
copyright owner under section 106.
- C. Business Software Alliance testimony at the
Sept 6th NIST meeting:
- 1. Perhaps NSA wants to require key escrow abroad in order
to get it at home.
- a. If firms build in key escrow to their products in
order to be allowed to export products with 64 bit keys
- b. It is then a small step to require them to activate
the key escrow features for domestic sales as well.
- 2. There is commercial demand for key escrow for archives
but not for communications.
- 3. If the government is going to require key escrow for
export, there is no reason to also limit the key length. It
should be unlimited, to allow U.S. software firms to compete
with foreign firms--which face no limit on key size.
- 4. The government promised in 1992 that permitted key size
would be increased over time to take account of increased
computer power--but it did not happen.
- 5. What we need is market driven escrow (voluntary, for
archived information, to make sure you can get it if the key is
lost)--without it (and long keys) U.S. industry is out in the
cold.
- D. EPIC used the freedom of information act to
get FBI files which made the argument that, in order for the
Clipper proposal to work, it is necessary to prohibit private
encryption.
- E. Netscape Cracking story: Someone used a bunch
of computers to crack a message encrypted with the 40 bit key that
Netscape uses for copies sold abroad.
- 1. They use 40 bits because that is the longest key they are
permitted to export.
- 2. The version used here uses 128 bits.
- 3. And Netscape regards the incident as evidence that the
ITAR restrictions should be lifted.
- F. Proposed new export policy as of the Sept 6-7
meeting:
- 1. It should be legal to export encryption products with 64
bit keys, provided
- a. They provide for mandatory escrow
- b. And the escrow is with agencies approved by the U.S.
government.
- 2. The product should have the key escrowed during
manufacture or be inoperable until the key is escrowed.
- 3. And should not decrypt messages produced by non-escrowed
products.
- G. There is said to be a bogus Dole home page on
the Internet. Suppose it is true. How does a firm or individual
prevent other people from damaging his reputation by pretending to
be him?
II. How important is wiretapping? Freeh's
statement
- A. "Terrorist crime will get worse ..."
- 1. How many people have been killed so far by terrorists in
the U.S.?
- 2. How many such crimes could have been prevented by
wiretapping if, but only if, something like the digital
telephony bill was in force?
- a. Oklahoma City, if the current version of what
happened is true, could not have been--there was no large
scale conspiracy to tap.
- b. World Trade Center? Maybe--but you would have to have
a spy in the relevant circles to alert you to whom to tap,
and once you have the spy tapping might be unnecessary.
- c. We do not know the answer to the question, the FBI
probably does not know. How plausible is their implicit
opinion (that there are a substantial number of such
crimes)?
- B. Wiretapping "played a role in convicting tens of
thousands of felons" over a decade--out of how many? Data from the
Statistical Abstract:
- 1987: 10 million people arrested, 2 million plus for
serious crimes. I don't have figures for convictions for the
whole country, but for U.S. District courts alone it was about
44,000.
- Convictions based on information received from wiretaps:
506
- Offenses specified of which those people were convicted:
homicide and assault: 18. Drugs and gambling: 514
- C. "The Administration has said time and again
that it will not force key escrow on ..." (Freeh)
From the initial Clipper Press Release:
Q: If the Administration were unable to find a
technological solution like the one proposed, would the
Administration be willing to use legal remedies to restrict
access to more powerful encryption devices?
A: This is a fundamental policy question which
will be considered during the broad policy review.
9/13/95
I. Non-clipper escrow solutions:
- A. Mandatory private software with fair ...
- 1. You cannot force software to have its keys escrowed, but
...
- 2. You can force public public keys (!) to have escrowed
private keys that match them
- 3. Provided that the public keys are public enough so that
law enforcement knows about them.
- 4. And if they are not, how does your correspondent find
your public key?
- B. What about software that provides voluntary
key escrow with an escrow agent of your choice as an
option?
- 1. Easy sell. Many purchasers would like the security of
being able to get their key back if it was somehow lost.
- a. What about spoof requests? How sure is the escrow
agency that the request is really coming from the person
whose key it is?
- b. When you escrow, you specify just who has authority
to ask for the key. Board of Directors? Can they be trusted
not to be fooled into violating security, retrieving the key
and letting it fall into the wrong hands?
- 2. Mainly useful for records
- 3. From the standpoint of law enforcement, this gives them
part of what they want--if the information has been encrypted
using a key escrowed somewhere, they can go to the escrow agent
with a court order and demand the key, but ...
- 4. What if the escrow agent is the equivalent of a Swiss
Bank--with a reputation to protect for being very reluctant to
release keys, even to law enforcement?
- 5. One could use ITAR to get private escrow going by
allowing the export only of software that requires escrow...
The NIST proposal. But who licenses the escrow centers?
II. Function of Clipper
- A. A pair of users can easily defeat it if it is
not mandatory by using their own encryption instead.
- B. A pair of users can defeat it even if it is
mandatory by pre-encrypting (thus not being obviously
non-Clipper)--until they are tapped, and charged with illegal
encryption.
- C. So it is chiefly designed to protect against a
single "Rogue" user, a term which really covers three
cases:
- 1. Ordinary conversation, one side knows it is secret,
other believes that it is not.
- 2. I call you and guarantee privacy (outgoing)
- 3. You call me and I guarantee privacy (incoming)
- D. Note that when they talk about a user
"knowing" the conversation is or is not secret (i.e. cannot or can
be read by a law enforcement official with a court order), they
are mostly talking about what the Clipper Chip in his phone
"knows."
- 1. If the Clipper Chip in my phone realizes the other Chip
is not kosher (i.e. is deliberately modifying the LEAF so that
the conversation cannot be decrypted), it will refuse to
decrypt the incoming message.
- 2. So I have to not only approve of keeping the
conversation secret, I also have to have a bogus Clipper Chip
on my side to do it with.
- 3. So the real objective (the most they can reasonably hope
to achieve) is to make sure that it takes two bogus Clipper
Chips to hold an untappable conversation.
- 4. The initial design apparently failed to achieve that--as
demonstrated by someone at Bell Labs. But it can be fixed.
- E. To make sure the other device knows if Clipper
is being defeated, the Law Enforcement block contains a checksum
of the session key encrypted only with the family key.
- 1. This only works if the family key is secret! Otherwise a
rogue Clipper chip (or equivalent in software) can be built
that gives the right checksum but gives the wrong session key
(which is encrypted with the chip's key, which the other chip
does not know and so cannot check on).
- 2. To keep the family key secret, it is built into a
"decrypt device"--a sort of reverse Clipper Chip. Like the
Clipper Chip, it is presumably protected against reverse
engineering. Nobody need know the family key--if you want to
use it to decrypt the LEAF of intercepted communications, you
check out a decrypt device.
- 3. So the law enforcement block is secret--but how long
until a decrypt device is stolen? That still does not let you
build rogue Clipper chips (since you can't get the family key
out of the device), but it does let you do traffic analysis
(who is talking to whom) without a warrant.
- 4. One advantage to hardware is that while it can be
stolen, it cannot be (easily) copied. So if one decrypt device
is stolen, one criminal has it. If the family key got out,
pretty soon everyone would have it.
III. Hardware v Software encryption--can we do the
equivalent of Clipper in software?
- A. We need to use (tamper-proof) hardware if we
use a classified algorithm
- 1. since software could be disassembled to deduce the
algorithm.
- 2. But we don't need to use a classified algorithm--there
are unclassified ones that will do just fine.
- B. Software can be modified to defeat the
escrowed encryption
- 1. But you can do the equivalent for hardware by
pre-encryption
- 2. All you need to equal Clipper is a system that requires
modified programs on both ends to defeat the escrow system.
- 3. Or in other words one that, like Clipper (once they fix
it), prevents rogue communicators.
- C. Problem: If you use software, how do you keep
the Family Key secret?
- D. Solution--use a public Family key for
encryption, private for decryption (by law enforcement agent). So
only the public key is built into the program.
- E. The receiver can construct the LEAF and check that
it is right! Since we no longer have keys associated with chips,
the LEAF constructed by either side is the same. We encrypt the
session key with the escrow agency's public key (fancier version
for multiple agencies) and include it in the LEAF instead of
escrowing chip keys with the escrow agency.
- F. Law enforcement takes the LEAF to the escrow
agency, gets back session key.
- G. All of this is my somewhat simplified version
of what the article proposes.
- H. The basic idea is that you can use public key
encryption plus software to accomplish the same objective that the
clipper chip accomplishes with private key encryption and
hardware.
IV. Digital Telephony bill
- A. Does not cover encryption, unless it is
provided by the phone company
- B. Rhetoric and general problems.
- 1. Strong on: "staggering" costs, "untold victims,"
"numerous lives," "dismantling organized crime groups" (WW1
fighters), "can assure you ...", "disastrous results," "billions
of dollars," "numerous deaths, ravaged lives."
- C. Cost/benefit info. Would not pass peer
review.
- 1. All numbers are from the FBI without auditing.
- 2. Fulton fish market has no legal monopoly--$1/lb is a
large number. That claim (that organized crime raised the cost of
fish in the U.S. by $1/lb) is sufficiently implausible to cast
some doubt on other claims.
V. Cost/benefit calculations:
- A. What is the cost of crime?
- 1. Zero--just a transfer?
- 2. But the opportunity to transfer provides an incentive to
spend resources transferring, and that expenditure is a net
waste...
- 3. As long as you can steal $100 at a cost of $50, more
people enter the stealing business, driving down the return
(everything worth stealing is already stolen), so you reach
equilibrium only when it costs about $100 to steal $100
(counting all costs, including risk of being caught and
punished).
- 4. So to first approximation, the cost of crime is equal to
the amount stolen.
- 5. Second approximation:
- a. The cost is less, because some people are especially
good at stealing, and can steal $100 at a cost of $80 even
though doing so costs the marginal thief $100.
- b. The cost is more, because we have to include the cost
of precautions that people take to defend themselves against
being stolen from.
- 6. The argument is not limited to theft. It also applies to
legal ways of transferring--lobbying for legislation that
benefits you at someone else's expense, for example.
- 7. But this analysis applies to crimes such as theft, which
are not what wiretaps are mostly about.
- B. Victimless crimes?
- 1. We don't even know the sign of their cost.
- 2. On the face of it, such crimes produce a benefit--which
is why the "victims" (drug users, gamblers, customers of
prostitutes) choose to participate.
- 3. To argue for a cost, you need to claim that someone else
somewhere is being hurt (or the customers are being hurt but
don't know it)--and it is hard to estimate the size of that
effect.
- C. Tax evasion?
- 1. If government expenditure is fixed, so that my evasion
simply raises your taxes, then the analysis is like the
analysis of theft, but ...
- 2. If expenditures depend on how much government can
collect, then my tax evasion reduces expenditures.
- 3. If you think a dollar of government expenditure buys
things more valuable than a dollar of private expenditure, the
result is more costly than an equal amount of theft.
- 4. If you think an extra dollar of government expenditure
does net damage, then tax evasion produces a net benefit.
9/18/95
I. Readings from Chapter 6 of BBB
- A. Diffie:
- 1. The FBI et al. claim that Clipper and the digital
telephony bill merely maintain the status quo between the
citizen and law enforcement.
- 2. But it really represents an attempt to lock in
technological changes that favor the state.
- a. When most interaction was face-to-face, it was hard
to "tap" the conversations of people who were making an
effort to keep them private.
- b. The constitution did not provide a right of private
conversation, presumably because it never occurred to the
founders that it could be infringed.
- 3. When it became possible to tap phones (and telegraphs)
without notifying the victim of the "search," they did not
"maintain the status quo" by forbidding it.
- 4. This is not entirely fair--the law did restrict the
right of the police to use the new technology, and did require
that the victim of a wiretap eventually be notified.
- B. Froomkin: Would mandatory key escrow (i.e. all
unescrowed encryption is illegal) be constitutional?
- 1. First amendment--that depends on how important the court
thinks the objective is relative to the chilling effect on
those with unpopular opinions of being unable to have secure
privacy for their conversations.
- 2. Fourth amendment: Should the escrow (the original
deposit of the chip key in a database) require a warrant?
- a. Froomkin thinks there is a presumption that it would,
but ...
- b. Administrative searches, such as mandatory drug
testing designed to deter crimes rather than reveal them,
are permitted; this seems to fit that pattern.
- c. Nothing similar has yet been permitted in the home,
so perhaps this would be a good argument against requiring
escrow for non-commercial speech.
- d. Obvious counter-argument: They are not searching your
house when they escrow your key--they are getting it from
the producer of the chip, who turns it over willingly.
- 3. Fifth amendment: Does any part of escrowed encryption
require self-incrimination?
- a. The initial disclosure of a key is not incriminating,
since you haven't yet said anything that would incriminate
you.
- b. Is requiring you to send the LEAF, which includes
information necessary to decrypt your conversation,
unconstitutional? Is it analogous to requiring someone to
give up his diaries, which under some circumstances has been
held to violate the fifth amendment?
- 4. Does it violate the constitutional right to privacy?
- a. This requires a balancing test.
- b. Whalen v Roe, which dealt with a state database of
patients using controlled drugs (and found it constitutional),
seems to imply that Clipper would pass, but ...
- c. Since the court is especially solicitous of privacy in
intimate affairs, and some marriages are conducted largely at
long distance, perhaps encryption of "intimate interactions"
over the net would be protected.
- d. Sounds pretty dubious--at least until virtual reality
gets a lot better.
I. Verisign is a new firm, marketing its product
as a way of facilitating the use of digital signatures rather than an
encryption approach. Nonetheless, it may be
very important for the spread of strong privacy. Information can be
found at: http://www.verisign.com/faqs/id_faq.html and, in much
briefer form, below.
II. How does a digital signature work?
- A. Assume (for the moment) that the recipient
knows your public key.
- B. A "hash function" converts the message into a
long string of digits--a large number. Call that the message's
"signature digest."
- C. You hash your message to create its signature
digest, then encrypt the digest with your private key. Send that
as the signature appended to your message.
- D. The recipient decrypts the encrypted digest
with your public key to get back the signature digest. He hashes
the message to calculate its signature digest for himself. He
checks to make sure they are the same.
- E. Thus a digital signature serves two functions.
It proves that you sent the message, and it proves that the
message has not been tampered with.
- F. In order to serve the second function, it must
be effectively impossible to start with a signature digest and
generate a message that hashes to that number. Otherwise someone
could intercept the message, hash it to calculate the signature
digest, replace it with another message that produced the same
digest, and send that (along with the original digital signature)
on to the recipient. Remember--the message here is signed but not
encrypted. The signature is supposed to authenticate both message
and sender.
- G. There are several hash functions that have
this characteristic.
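A sketch of that flow, combining SHA-256 with the toy RSA pair from the 8/28 notes; real signature schemes pad the digest instead of reducing it mod n, so this shows structure only.

    import hashlib

    n, e, d = 3233, 17, 2753    # toy RSA key pair; d is private, (e, n) public

    def digest(message):
        return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

    def sign(message, private_d):
        return pow(digest(message), private_d, n)

    def verify(message, signature, public_e):
        return pow(signature, public_e, n) == digest(message)

    msg = b"I, P, agree to pay"
    sig = sign(msg, d)
    assert verify(msg, sig, e)                    # genuine message checks out
    print(verify(b"I, Q, agree to pay", sig, e))  # almost surely False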
III. So far we have assumed the recipient already
has the sender's public key. We now drop that
assumption. The sender can, of course, send the recipient his public
key--but how can he prove that public key XYZ really belongs to
person P? Until he does so, he cannot provide a digital
signature--and without a digital signature, he cannot prove that the
message is really being sent by him.
- A. Solution: Digital ID.
- 1. If there is someone you trust and whose public key you
know, then ...
- 2. Your correspondent sends his name and public key in a
document signed by that someone.
- 3. Once he has such a document--an ID linking him to his
public key and signed by someone who is trusted and whose
public key is widely known--he can make copies and use them to
identify himself not only to you but to anyone else who trusts
the signer and knows his public key.
- 4. We now have digital ID cards, which can be "shown"
simply by transmitting a message.
- 5. An Internet Driver's License--not in the sense of a
license required to drive but rather of something that, like a
driver's license, identifies you.
- 6. Note that the person you show your ID to must not only
have the signer's public key but also trust the signer to be
careful about checking that someone who asks for an ID is who
he says he is.
- 7. In a way, this is more like the old "letter of
introduction" than it is like a driver's license. There is no
single person "authorised" to give it out--it all depends whom
the recipient trusts.
- B. What if the sender and the recipient have no
common acquaintances?
- 1. I send you my digital ID signed by X (whom I know and
you don't), along with a digitally signed letter from Y (who
you know, and X knows, but I don't know) certifying that X is
reliable and that his public key is 41690... . Since you now
have X's public key, you can check his signature, thus verify
my ID.
- 2. You can run such a chain through many levels, with
multiple certificates, up to one completely trusted certifying
authority at the top.
- 3. Or up to two somewhat trusted certifying authorities,
since I can send you multiple certificates.
- 4. For each step, you need a way of both knowing that that
authority is reliable--that he checks people's identities
before issuing them ID's--and knowing his public key.
- a. You could get either fact, both, or neither from a
certifying authority one step higher
- b. You could get either, both, or neither from your own
knowledge. For example ...
- c. You may know that the boss of your company is
reliable about issuing ID's to employees
- d. But have to verify his public key via a certificate
from some certifying authority.
- e. What matters is that between his certificate and your
knowledge you have both facts.
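A sketch of walking such a chain back to a trusted authority. The names, keys, and the stand-in signature check are all invented for illustration; a real verifier would check actual digital signatures over each certificate.

    def verify_sig(cert, issuer_key):
        # Stand-in for checking a digital signature over the certificate
        # body with the issuer's public key.
        return cert["signature"] == ("signed-by", issuer_key)

    def check_chain(chain, trusted_keys):
        """chain[0] is the sender's ID; later entries certify earlier signers."""
        keys = dict(trusted_keys)            # name -> public key we already trust
        for cert in reversed(chain):         # start nearest the trusted root
            issuer_key = keys.get(cert["issuer"])
            if issuer_key is None or not verify_sig(cert, issuer_key):
                return None                  # chain never reaches anyone we trust
            keys[cert["subject"]] = cert["subject_key"]
        return keys[chain[0]["subject"]]     # the sender's now-verified key

    trusted = {"Y": "Y-PUB"}                 # you already know and trust Y
    chain = [
        {"subject": "me", "subject_key": "ME-PUB", "issuer": "X",
         "signature": ("signed-by", "X-PUB")},
        {"subject": "X", "subject_key": "X-PUB", "issuer": "Y",
         "signature": ("signed-by", "Y-PUB")},
    ]
    assert check_chain(chain, trusted) == "ME-PUB"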
IV. What Verisign is doing:
- A. Providing software and hardware so that a firm
can set itself up as a certifying authority to certify keys inside
or outside of the firm.
- 1. In particular, the firm's private key must be known to
nobody
- 2. It resides inside tamperproof hardware, and is destroyed
if the hardware is opened. The hardware is available from
Verisign.
- 3. This is similar to the way in which the family key is
protected under the Clipper Chip proposal. Nobody knows it--it
is simply embedded in hardware which a law enforcement agency
can use.
- 4. And the firm's public key is certified by Verisign.
- B. Setting up as a top level certifying authority
itself.
- C. Establishing standards on how to be a
certifying authority.
V. Who are they?
- A. A spinoff from RSA; the chairman of their
board is the president of RSA
- B. Their announcement on the net includes
favorable quotes from: Visa, Ameritech, Mitsubishi, AOL, Apple,
Netscape, Intel, Lotus, Microsoft, Sun, Dun and Bradstreet, GE Inf
services. Some of those firms are also investors.
- C. The service they provide is making it possible
to prove that a public key belongs to an individual.
VI. Details:
- A. A Digital ID contains, in addition to the name
and public key of the person being certified, the expiration date
of the key, the name of the Certifying Authority that issued the
Digital ID, the serial number of the Digital ID, and perhaps other
information. And it includes the digital signature of the Digital
ID issuer-- a Certifying Authority.
- B. Note that your public key is generated by
you!
- 1. You take it to a certifying authority with your passport
or some other realworld ID to get your digital ID.
- a. How careful is the authority to make sure you are
you?
- b. That is up to them.
- c. And may help determine whether people accept the ID's
they issue.
- C. The certifying authority maintains a list of
cancelled keys.
- 1. So if your private key somehow gets out, you tell the
C.A. to cancel your keys.
- 2. If you get a signed and certified message and you want
to be really sure it is genuine, you check with the C.A. that
certified it (i.e. that issued the certificate linking the
sender to his public key) to make sure it has not been
cancelled.
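A sketch of that check, on the assumption (hypothetical in its details) that the C.A. publishes a list of the serial numbers of cancelled IDs:

    # Serial numbers of cancelled Digital IDs, published by the C.A.
    # (hypothetical values).
    revoked_serials = {1041, 2210}

    def still_valid(cert, today):
        # cert: dict with "serial_number" and "expiration" (ISO date string,
        # so string comparison is date comparison). Good only if unexpired
        # and not on the cancellation list.
        return (cert["expiration"] >= today
                and cert["serial_number"] not in revoked_serials)

    print(still_valid({"serial_number": 1041, "expiration": "1998-06-30"},
                      "1996-10-01"))    # False--this ID has been cancelled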
- D. All keys expire after some time delay
(Verisign suggests two years).
- 1. To guard against long term attacks.
- 2. Because as computers get more powerful, you want to use
longer and longer keys to be safe.
VI. Is a digital signature legally valid? We don't
know yet.
- A. Utah has enacted legislation validating digital
signatures. Legislation has been introduced in California and New
York.
- B. You could sign a paper document agreeing to be
bound by your digital signature, and to accept the digital
signature of the person you are going to be doing business
with.
VII. Verisign can be viewed as a Trojan Horse for
Public Key Encryption!
- A. If you want to prevent widespread use of
public key encryption, the obvious weak point to attack is the
process by which keys are made public.
- 1. It is hard to search every computer in the country, but
not so hard to close down whoever publishes the "public key
phone book" for the whole country.
- 2. Even if you cannot keep it from being published (perhaps
abroad), you can force it underground, and thus make it hard
for people to distinguish the real phone book from a bogus one.
- B. Suppose Verisign succeeds in establishing
widespread institutions for digital ID's--allowing someone to
prove that a particular public key is his.
- 1. You want to correspond with a known person (i.e. you
know his name and EMail address) whose public key you do not
know. You email him asking for his public key.
- 2. He sends you his public key--along with a certificate
from a trusted CA (or chain of them) proving that it is his.
- 3. You can now send him encrypted messages safely.
- 4. We no longer need a public key phone book.
- C. So Verisign solves the verification problem,
in a fashion similar to the solution used by PGP. Decentralized
verification, since chains of CA's need not all go up to
Verisign.
- D. Why are they putting all of this in terms of
digital signatures?
- 1. Because there are obviously good reasons why lots of
firms would want a reliable system of digital signatures in
place.
- 2. Export of software for verifying digital signatures is
legal
- a. Provided that that software cannot be used for
encryption
- b. But the key element of strong privacy that they are
providing is not public key encryption but institutions for
verifying public keys
- c. And once those institutions exist, you can use
someone else's encryption software (produced abroad or
exported illegally in violation of ITAR).
- 3. And nobody ever blew up a building with a digital
signature.
- a. Or in other words, although they potentially weaken
the power of law enforcement
- b. They do so in a less obvious manner than encryption
- c. And are thus harder to get people scared of.
- d. Very clever people, these computer geeks.
- E. And now we see why the government wanted to
establish DSS instead of RSA as the standard for digital
signatures--because DSS does not also provide encryption
capabilities.
0: Guest Lecturer--Silicon Valley's Computer Cop.
Some bits.
- A. Very low police resources are devoted to
computer crime.
- 1. 1-6 in Silicon Valley, the FBI has about seven people
specializing in computer crime with CS degrees or the like--and
they are not likely to be in the class of a Mitnick or Morris.
- 2. If you are a cop and known to know something about
computers, they assign you to search DOS computers--which is
not a very interesting job.
- 3. Is one reason for Clipper, Digital wiretaps, etc. that
law enforcement knows it is very weak and wants some
technological help?
- B. How do you prove evidence is real?
- 1. On a machine the defendant doesn't control, you don't.
- 2. On a machine he does control, you had better have witnesses.
I. Review Digisign
- A. The basic idea
- 1. A digital signature requires that the recipient know the
signer's public key; it guarantees source and contents.
- 2. To use a certificate to tie a public key to a person
requires that you know the public key and reliability of the
signer.
- 3. Which you can get by some combination of your own
knowledge and attached certificates of him by higher level
C.A.'s
- B. The implementation
- 1. They offer the hardware, software, RSA license
- 2. Using existing standards for digital signatures based on
RSA
- 3. And will function as top level CA if you want them to.
- C. Why it matters: Level I
- 1. With digital signatures, business on the internet is
much easier.
- 2. And potential spoofing problems involving Web, News,
Mail can be dealt with.
- 3. And we can construct Newsgroups etc. where people are
not anonymous.
- D. Why it matters: Level II
- 1. The weak link in a system of public key encryption is
the phone book,
- 2. Because it is a big project, and if driven underground
hard to check.
- 3. Digital signatures are the trojan horse that gets
signature verification going, since ...
- 4. You can ask someone for his certified signature.
- 5. And harder to campaign against, which is
- 6. Why NSA tried to push DSS over RSA.
II. The new hole in Netscape Security?
- A. Random vs pseudo random numbers.
- 1. Turn on the computer from zilch, do everything the same,
it gives the same random number!
- 2. Solution one--physically random process, preferably
quantum mechanical, built in.
- 3. Solution two--stay pseudo, but seed the computer with a
number determined by outside input.
- 4. For instance, seed it with the clock time at the instant
the user starts the process, hits a key, etc.
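The problem, and the fix, in a few lines of Python (random here stands in for whatever generator the software actually uses):

    import random, time

    random.seed(12345)             # start from the same state every time ...
    print(random.random())         # ... and you get the same "random" number

    random.seed(time.time_ns())    # seed from outside input--the clock
    print(random.random())         # now differs from run to run; but see below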
- B. How Netscape generated keys:
- 1. Seed was based on time in seconds, time in milliseconds
since the last second, process number, and ...
- 2. If you are on the same machine, watching the same clock,
when the message goes through ...
- 3. You observe seconds and approximate milliseconds, or get them
from the header of his message, get the other numbers from the
header or guess from a small set of alternatives (one of them
is often 1 on X-Windows machines).
- 4. Try out seeds until one works, and ...
- 5. Thereafter they are generating new keys without
reseeding! So having broken the encryption once, you can read
everything else he sends anyone that session.
- 6. On a different machine, it would probably be harder, but if
the clocks are accurate ... .
- 7. In the worst case, there are 2 to the 47 possibilities
for seeds, instead of the 2 to the 128th possible keys. That
simplifies the problem roughly a billion billion million fold.
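A sketch of the attack. If the seed is built from a few observable or guessable quantities, the attacker just tries every candidate seed until the generator reproduces the key he saw; the packing of the seed and the key derivation below are hypothetical stand-ins for what Netscape actually did:

    import random

    def key_from_seed(secs, ms, pid):
        # Stand-in key generation: pack the quantities into one integer,
        # seed the generator, draw a 128-bit key.
        rng = random.Random((secs << 20) | (ms << 10) | pid)
        return rng.getrandbits(128)

    def crack(observed_key, approx_secs, candidate_pids):
        # Try every plausible clock reading and process number.
        for secs in range(approx_secs - 2, approx_secs + 3):
            for ms in range(1000):
                for pid in candidate_pids:
                    if key_from_seed(secs, ms, pid) == observed_key:
                        return secs, ms, pid    # seed found
        return None

    key = key_from_seed(844034571, 417, 1)      # pid 1, as often on X-Windows
    print(crack(key, 844034571, [1, 2, 3]))     # (844034571, 417, 1)

A few thousand trials instead of 2 to the 128th--and once the seed is recovered, every key generated from it that session falls with it.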
- C. Solution: Use a much wider range of seeds,
depending on harder to observe things.
- D. Lessons.
- 1. How long a brute force attack takes is not all you need
to know. By this attack, the 40 bit and 128 bit versions of
Netscape encryption were about equally easy to crack.
- 2. The virtues of open examination. If people can look for
holes, they may find them.
- 3. Will Netscape open the new version?
III. Economic espionage problem?
- A. Former Soviets? Unlikely. Too far behind, but
...
- B. Their unemployed spies, working for western
companies, maybe.
- C. Intelligence agencies of allies etc. are a
more likely threat. If so, encryption good enough to protect only
against amateurs (40 bit, for instance) is not adequate.
IV. DSS--was the trap door intentional?
V. Is ITAR constitutional?
- A. Is it a prior restraint on publication?
- B. If so, ... only restrictions on classified or
official information can be imposed.
- C. And should be court reviewable etc.
I. Lund v Commonwealth of
Virginia 232 S.E. 2d 745 (Va. 1977); SC of
VA
- A. Larceny had to be of property, not of
services.
- 1. Old case, running a factory you do not own, someone else
pays the wages (to employees who think this is ordinary
overtime?), you buy inputs, sell product.
- 2. Some states changed laws to cover services.
- B. The only property stolen consisted of
printouts and cards.
- 1. How do you value them?
- 2. Cost of production?
- a. Do you price it as funny money (numbers used for
internal accounting--each department gets a budget of $X to
be spent only on buying computer time), or
- b. Try to estimate a real shadow price--the cost your
use imposed on the owner.
- 3. Value to the thief?
- 4. Value to anyone else is approximately zero.
- 5. The issue of valuation comes back in the Bellsouth case.
- C. Why prosecute? Why use criminal rather than
civil law? VPI gave him his degree, so presumably they don't
strongly disapprove of what he did.
II. United States v.
Seidlitz 589 F.2d 152 (4th Cir.
1978)
- A. How do you apply ordinary law to computer
contexts?
- 1. "by means of false or fraudulent pretenses,
representations, or promises ..." (Fed wire fraud statute).
Representations to whom? Fraud implies a human being fooled, no?
Is picking a combination lock fraud?
- 2. Was the owner of the computer "a party to" the phone
calls between Seidlitz and their machine?
- B. Was he guilty? Was he stealing or testing for
security holes?
- 1. He could have walked off with Wylbur more easily while
still working for them, presumably?
- 2. Is Wylbur really a secret, or protected by contract?
Lots of copies presumably.
- 3. What does he want hard copy for? 1975 technology.
- 4. Are all crimes crimes? I still have keys to U of Chicago
Law School, because I never bothered to return them. If I am
there over break and use them to go check a cite in the
library, I may be technically trespassing (even breaking and
entering?), but nobody associated with the institution would
think of it that way.
III. United States v
Jones 553 F. 2d 351 (4th Cir.
1977)
- A. Fraud or forgery?
- 1. If real checks obtained by fraud, she is guilty, but ...
- 2. If forged checks, she is innocent (of this crime)
- B. The computer "really issued" the checks so
they were real checks fraudulently obtained?
- C. Or the computer was the tool Everston used to
forge the checks, by changing the vendor number (before the checks
were printed).
- D. Is the computer a tool used by Everston
(district court's view) or did he defraud the computer, or Inglis,
into writing a real check?
- E. What if no human intervened between the fraud
and the writing?
IV. The People of New York
v. Robert Versaggi
- A. Did he "intentionally alter ... computer data
or a computer program?" No.
- 1. The data and the program were untouched
- 2. Do I alter my home wiring when I turn on the lights?
- B. He misused the computer.
I. If someone wants to do a paper on the
pretensions of the Attorney General of Minnesota to rule the
internet, some interesting questions might be:
- A. Is it legal?
- 1. Scienter--if you cannot know that you are dealing with
someone in Mn, can you be guilty of knowing that you were
violating their law?
- 2. What if he ought to know that what he is doing
(gambling) is illegal for most of his customers?
- B. What are the realworld implications if the Mn
claims are correct? What would the net look like?
- C. How might the law be changed to deal with such
problems?
- 1. Forced identification
- 2. New jurisdictional rules
- 3. Treat "telecommuting" like physical transportation.
Citizens of Minnesota are free to go gamble in Las Vegas.
- D. Is this a new problem?
- 1. Telephones
- 2. Mail forwarding
II. The Hacker Crackdown: The sociology of
computer crime
- A. What the cops know that ain't so:
- 1. The Bell crash
- a. Why should they believe the official version; perhaps
- b. It was designed to prevent embarrassment, or
- c. avoid encouraging a repeat.
- 2. Hacker self-glorification also encourages police
suspicions.
- B. Illegal industries are poorly known; paranoia
may reign
- C. "Hacker" and "Hacks."
- 1. A "hack" was an odd and ingenious way of doing/designing
something.
- 2. A programmer seeing his first elephant might describe it
as an impressive hack--"it picks up things how!?!"
- 3. The word "Narc" has had a similar change through back
etymology (deducing its meaning from a false guess at its
origin). It originally meant a police spy, possibly from a
Romany word for "nose." People heard it as narc=nark=narcotics
agent.
- 4. And "hacking" now implies something illegal and possibly
destructive.
- D. Is it wrong? Whistling down the wire. Driving
a car backwards.
- E. Consider ways of enforcing rules in
hacker/phone freaker culture: instead of executing someone you
turn him in to the police. Your weapon is information rather than
force.
- F. Believing what people want to believe:
- 1. Hackers want to believe they do no harm, computers are
overpriced, ...
- 2. Police want to believe that hackers are thieves.
- 3. And that markets are organizations: Mafia. You can fight
an organization.
- G. The effect of law enforcement on criminal
industries
- 1. Drugs
- 2. Luciano: by an account attributed to him, the reason
criminal firms joined together during prohibition was to
cooperate in bribing police and judges.
- H. The files that were on the BBS's the police
seized--should they have been there? Similar information (on how
to commit crimes) is in books sold openly. Should such books be
banned? Is the 1st amendment a good idea?
- I. Sting operations don't require phone
taps.
- J. There may be a delicate balance between not
treating hackers badly enough to make them want to commit
largescale vandalism and not treating them well enough to make
hacking an attractive and popular hobby.
III. U.S. v. Robert
Riggs (and Craig Neidorf)
- A. What happened:
- 1. Riggs copied a text document from a Bell South
computer--it had a secrecy warning at the top.
- 2. Neidorf edited it and published it in Phrack
- 3. The file zipped back and forth across state lines in the
process.
- B. Is it fraud?
- 1. If you get the file by calling up a secretary and
persuading her that you are someone she is supposed to send it
to, then yes.
- 2. If you get it by "persuading" a combination lock to let
you in at night to steal the document, no.
- 3. This case is somewhere between the two.
- C. No fiduciary relationship between Riggs and
Bell South--is this deprivation of an intangible right (requires a
fiduciary relationship to be fraud) or of property (does
not)?
- D. Is what was transmitted a thing? "goods, wares
or merchandise."
- 1. Transmitting money counts.
- 2. Does transmitting Bell's proprietary information count
as transmitting "goods, ... ?" Yes, if affixed to a tangible
medium. What if the medium is not stolen (U.S. v. Brown below)?
- 3. This court holds that it counts even if what was
transmitted is not tangible--others have disagreed.
- 4. Anyway--the court says that computer storage is more
like paper than like memory.
- 5. But no computer was transported--just the information.
What if trade secret information had been transmitted by a
phone conversation?
- E. Is what was transferred the type of property
that can be "stolen, converted or taken by fraud?"
- F. Dowling v
U.S. holds that copyright violation does
not qualify. Pirated records (from broadcasts), when transported
across state lines, are not subject to the Federal Stolen Property
statute. The criminal owned the records, and the intellectual
property was not "goods, wares or merchandise."
- G. This court holds that trade secret can be
stolen even if copyright can only be infringed.
- H. Are "Hacker" and "Legion of Doom" prejudicial
terms?
- I. The real story:
- 1. The Sysop of the BBS on which the document was stored
reported it, the information got to Southern Bell security
months before the document's publication, and nothing was done.
- 2. Every document produced by Southern Bell for internal
use had the same secrecy warning.
- 3. The information was administrative, not technical. Maybe
helpful for verbal fraud, but ...
- 4. Southern Bell sold a document with more extensive
information to anyone for $13.
- J. Is there a first amendment issue here?
Neidorf is being charged with publishing something, not stealing
it.
- K. Law enforcement was trying to "send a
message"--discourage communication of information useful in
committing crimes--i.e.
- 1. BBS's and electronic journals that published
- 2. Other people's phone ID and credit card numbers
- 3. Information on how to get such things and
- 4. Information on how to get entry to other people's
computers.
- 5. Worried, in this case, that the information might be
used to sabotage the 911 system.
- L. Is this a legitimate objective? Books on "how
to get even," "Anarchist's Cookbook," etc. are protected by the
First Amendment. They tried to achieve it by harassment--seizing
computer systems as evidence and then (in many cases) neither
indicting people nor returning the seized systems.
- M. Does Neidorf have legal recourse against the
Feds? Against Bell South (which claimed the document was worth
$80,000)? Should he? Is Bell guilty of abuse of process?
- 1. Bell's figure was a (much inflated?) estimate of the
cost of producing the document. But ...
- 2. They still had it. The relevant figure ought to be
either
- 3. The cost to them of the info getting out or the value to
the thief.
- 4. Both were trivial, since they were freely selling a
document containing the same information.
IV. Unix source code cases. 1990.
- A. Law enforcement was following a trail from BBS
to BBS via EMail
- B. Found a respectable computer consultant, who
...
- C. Had lots of Unix source code on his computer,
and traded with similar people to get more.
- D. Is he a criminal with millions of dollars of
stolen property?
- 1. Obviously not--Bell can sue if there is any damage
(there probably is not).
- 2. If Bell does sue, they may find it harder to get
contractors.
- E. Typically in such a case his equipment is
seized but no charges are ever made! (Not here)
- F. What is happening? It is useful to have source
code when doing subcontracting work. Bell is restrictive about
giving it out. So A has source code for one part of Unix, B has it
for another, they trade in order that neither has to persuade Bell
to give him the code when and if he actually needs it.
(conjecture)
- G. Should Bell worry about such behavior? If
intellectual property is easy to steal, the best strategy may be
to enforce your rights against people who misuse it, not against
ones who simply have it.
- H. If Bell gets people like this thrown in jail,
they may find it harder to hire consultants next time.
- I. Conjecture: Bell's right hand did not know
what its left hand was doing. Bell Telephone people are trying to
intimidate hackers and phone phreaks; Bell Unix people would not
want their consultants arrested for harmless acts, even if they
are in violation of contract.
VI. Review: Issues raised by the criminal
cases.
- A. What is computer fraud? Is using a fake
password analogous to lying to a person or to opening someone
else's combination lock?
- B. What is information?
- 1. In what sense does copying it deprive the owner of it?
- 2. Does transporting it count as transporting "goods, wares
or merchandise?"
- a. Does it depend on the form in which it is
transported--phone call or diskette?
- b. Does it depend on who owns the diskette?
- c. Does it depend on whether it is copyrighted or a
trade secret?
- 3. Can it be "stolen, converted or taken by fraud?"
- 4. How do you measure its value?
- a. Production cost? Clearly wrong.
- b. Value to thief--perhaps, or ...
- c. Cost to victim.
- d. Note that the Neidorf case is a clear example of why
the victim might lie about the value.
VII. Steve Jackson case:
- A. The facts:
- 1. Steve Jackson Games had only a tenuous connection to
anything illegal--an employee (Blankenship, aka Mentor) had
Phrack on his board (Phoenix), and had posted talk about a
planned decryption service.
- 2. Phoenix was an ideological board with no stolen credit
cards, codes, etc., on it. Blankenship invited telephone people
to come on the board and argue with hackers.
- 3. Kluepfel is the guy who was told about the 911 document
and chose not to do anything about it.
- 4. Izenberg ran a BBS on his own machine, connected with
Terminus. Foley urged him to admit all sorts of things, none of
them true. He let them seize his machine as evidence; he was
charged with nothing. Six months later he asked about getting
it back; as of two years later he still had not gotten it back.
- 5. "Erik Bloodaxe," co sys-op of Phoenix, had his stuff
seized. Two years later, they still had it. No charge.
- 6. Secret Service raided Phoenix, seized everything, no
charges, two years later still had it--including his wife's
thesis. No copy provided.
- 7. They also raided Steve Jackson Games, apparently on the
theory that Jackson's computers might have evidence and that
Mentor might be able to erase it if warned.
- a. Seized work in progress,
- b. Working equipment,
- c. Files of BBS EMail system.
- d. Nobody was charged with anything.
- e. Warrant was bogus--alleged false statement linking
SJG to E911 case, attributed to Kluepfel, who denied it.
- 8. GURPS Cyberpunk was the almost finished manuscript
seized. A book, not a game.
- 9. The next day, Jackson asked for the book back, so he
could finish publishing it. Was told by an agent that it was a
"manual for computer crime." Later told the same thing by other
agents. He was not told about the connection to the 911
document case until months later when the warrant was unsealed.
- a. This gave everyone reason to believe for some months
that it was a prior restraint case.
- b. And leaves suspicion that it really was a prior
restraint case--that once the Secret Service saw GURPS
Cyberpunk, they concluded that Steve Jackson was a bad guy
and ought to be treated accordingly.
- c. Supposedly, the Secret Service still uses stuff from
GURPS Cyberpunk in their computer crime training video.
- 10. The Secret Service claims they did not know of either
the Privacy Protection Act or that SJG was a publisher.
- 11. They were, however, informed of the latter fact during
the raid
- 12. And of both the next day.
- 13. The Secret Service refused to give copies of anything,
did not return the bulk of what was seized for almost four
months.
- B. The Verdict
- 1. The Secret Service was not liable for defects in the
initial warrant application
- 2. Not liable for damages for the first day of the seizure
- 3. Liable for damages for the rest of the year for
violation of Privacy Protection act in searching a publisher
- 4. Not liable for damages thereafter because they
were balanced by SJG's gains from publicity
- 5. Not a violation of the Wire and Electronic
Communications ... act interception rules because reading a
stored message is not interception,
- 6. But it does violate the "Stored Wire and Electronic
Communications and Transactional Records Access" procedures set
out by the act for getting access to stored communications, so
$1000/plaintiff statutory damages. What about the other 362
plaintiffs? Collateral estoppel does not apply to the
government, so presumably they all have to litigate if they
want their $1000/plaintiff.
- C. Some narrow legal issues:
- 1. No damages for day 1, ordinary damages for the next 4
months. Should it be ordinary damages for day 1 (negligence),
punitive thereafter (deliberate violation of the law)? But the
U.S. waiver of sovereign immunity does not cover punitive
damages, so you cannot get them against the government.
- 2. SJG got damages only for 1990, because of their later
gain from publicity. Is that offset proper? If I buy a SJ game
out of sympathy for Steve Jackson as a victim of the Secret
Service, do I really intend the money to go to reduce their
civil liability to him?
- 3. Suppose the victim was not a publisher--does the Secret
Service have a legal obligation to minimize damage? Consider
the wife's thesis. Would it be possible to institute a class
action covering all such cases, asking for an injunction
requiring the Secret Service to institute a policy of prompt
copying and return? The data, not the computer, is the
evidence.
- 4. What about requiring them to permit victim to copy?
Greenhouse?
- D. Broader issue: Is the whole campaign in
violation of the first amendment?
- 1. The objective was to suppress the distribution of
information in order to suppress crime.
- 2. When is that legal?
- E. Still broader issue: What are the limits on
police imposition of costs instead of indictment?
- 1. Old style: beat someone up.
- 2. Arrest someone, let him cool off in jail over night,
drop charges.
- 3. Seize a computer.
- 4. All of them work because the police have the threat
that, if the victim objects, they could impose larger costs, at
some cost to themselves (indict, jail, maybe lose the case).
- 5. Is this a bad thing, or a necessary adjustment to the
real world?
- 6. How might it be prevented or limited? Should police or
police departments be liable, and when?
- 7. Would that give police an incentive to make dubious
indictments instead of dubious seizures?
- 8. What about obligating them to use the least costly method?
- F. Sociology issue: "Those Kids aren't
Criminals"
- 1. How does, or should, the law separate acts by those
trying to make money, do injury, etc. from equally illegal acts
done as a prank, etc.?
- 2. By age--asking for child drug runners.
- 3. By intent--criminal copyright infringement example.
- 4. Moral sanction as one form of deterrence--breaks.
- 5. Partly a problem here because of inverted hierarchy of
age/expertise
- 6. Leopold and Loeb?
VIII. "Sending a Message"
- A. If the message was designed to deter legal but
wicked acts ...
- B. Is sending it unconstitutional?
- C. Steve Jackson--does the SS still think
Cyberpunk Hackers is computer crime? They apparently (asserted by
a correspondent) still use it in their training videos.
- D. Why didn't they give back the computers, or
their contents?
- 1. To restrain publication
- 2. To impose costs
- 3. Because they did not want to admit they hadn't found
anything?
- E. How do you stop it?
- 1. Make police civilly liable for costs they impose on
someone if the victim is not indicted? But that gives them an
incentive to indict innocent people, imposing still more costs.
- 2. If not convicted? but that gives them an incentive to
convict innocent people.
- 3. If the police do not get their evidence in the least
costly practical way.
- 4. Note the intimidation problem--victims may not protest
if the police could impose still larger costs on them. Consider
our computer cop's comment about how cooperative ISP/BBS types
are.
IV. Sociology issue: "Those Kids aren't
Criminals"
- 1. How does, or should, the law separate acts by those trying
to make money, do injury, etc. from equally illegal acts done as a
prank, etc.?
- 2. By age--that is asking for child drug runners.
- 3. By intent--for example, copyright infringement is not
criminal if not done for profit.
- 4. Moral sanction as one form of deterrence.
- 5. Partly a problem here because of inverted hierarchy of
age/expertise. The kids know enough more than the grownups to be
able to do serious damage.
- 6. Leopold and Loeb? Those kids' prank consisted of murder.
V. Odds and Ends:
- A. Cancelmoose.
- 1. Explain:
- a. Spamming: The practice of sending one posting, often
commercial, to many unrelated news groups.
- b. Cancelling: A poster can cancel his post on some
machines. So can someone else, by forging a cancel message
from the original poster.
- c. The Moose: Someone has software that he uses to
routinely cancel spammed posts.
- 2. Consider it as a non-legal defense against a legal but
undesirable practice.
- 3. Is it an illegal act?
- a. forgery? Who are you fooling?
- b. misrepresentation, defamation, injurious falsehood.
- c. Unauthorised use?
- d. Note: Sysop of a machine chooses (or doesn't) to
permit automated cancels on his machine.
- 4. Alternative: Moose filter? Some way so that those who
approved of the Moose's activity could follow his instructions
to cancel, those who did not could ignore them, with the Moose
using a digital signature to prove messages were from him.
- 5. Apparently Moose is doing something along these lines in
his new software.
"However, the problem is becoming moot, as
Cancelmoose(tm) (moose@cm.org) has devised a new mechanism, called
NoCeM, that will let you set your system to respond however you'd
like to PGP-signed requests from people you authorize, where
responses include things like not showing you the postings
and showing you only the postings that are named in the requests.
So you can get a lot more control over spam without having to open
your system up to forgeable cancels."
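A sketch of the NoCeM idea--act on a cancel notice only if it carries a valid signature from an issuer you have chosen to trust. The HMAC used here is a shared-secret stand-in for the PGP public-key signatures NoCeM actually uses; the names and keys are hypothetical:

    import hashlib, hmac

    # Issuers this site has authorized, with their keys (hypothetical).
    trusted_issuers = {"moose@cm.org": b"moose-shared-secret"}

    def notice_is_genuine(notice, issuer, signature):
        key = trusted_issuers.get(issuer)
        if key is None:
            return False                 # not someone we authorized--ignore
        expected = hmac.new(key, notice, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, signature)

    def apply_notice(feed, notice, issuer, signature):
        # Hide the named postings only if the notice checks out; a forged
        # cancel from a stranger simply has no effect.
        if not notice_is_genuine(notice, issuer, signature):
            return feed
        named = set(notice.decode().split())
        return [post for post in feed if post["id"] not in named]

    feed = [{"id": "123@spam"}, {"id": "456@ok"}]
    notice = b"123@spam"
    sig = hmac.new(b"moose-shared-secret", notice, hashlib.sha256).hexdigest()
    print([p["id"] for p in apply_notice(feed, notice, "moose@cm.org", sig)])
    # ['456@ok']

Note that the decision stays with the receiving site: each sysop picks whose notices to honor, so there is no single point to shut down.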
According to Restatement (2d) of Torts sec.
623A,
"One who publishes a false statement
harmful to the interests of another is subject to liability for
pecuniary loss resulting to the other if
(a) he intends for publication of the statement to
result in harm to interests of the other having a pecuniary value, or
either recognizes or should recognize that it is likely to do so,
and
(b) he knows that the statement is false or acts in
reckless disregard of its truth or falsity."
I. The old nightmare: Computers as the end of
privacy.
- A. Two forks: Public (government information
about you) and private (credit bureaus etc.).
- B. Privacy (should there be limitations on true information?
On probably true information?)
- 1. Is privacy more or less of a problem now than in the
past?
- a. Much less--consider small town vs city. The computer
only partly reverses this ...
- b. The modern credit agency knows less about you than
your neighbors would have known.
- c. Does freedom and autonomy include a right to commit
fraud? What about "social fraud"--being friendly with people
who would not be if they knew the truth about you?
- d. Is having information with some errors more unfair
than if the information was not there at all, so that people
would be treated the same independent of the truth about
them?
- e. Does a right not to have misleading information in
your file improve the accuracy of the reporting system?
There may be useful information that they cannot prove is
true.
- 2. If there is less privacy, is that a bad thing?
- a. Better information technology leads to a decrease in
privacy.
- b. But not necessarily to an increase in the risk of
being damaged by false information about oneself. Better
technology may make it easier to distinguish true
information from false. Computerised credit reports are
likely to be much more reliable than gossip.
- c. Does an improvement in information technology lead to
a shift in power? If so, is it from those who do not have
information to those who do, or from bad guys to their
potential victims? If an employer can find out which actual
and potential employees are good and which bad, does that
mean that the employer has more power over workers, or that
good workers have increased power to get rewarded for
working well, and bad workers less power to get rewarded for
working badly? Similarly, does improved credit information
mean a transfer of power from borrowers to lenders, or from
deadbeats to people who pay their bills?
- d. There are at least two potential ways of dealing with
privacy problems. One is to restrict the gathering,
distribution and use of information about people. The other
is to permit people to do things, such as encrypting their
communications, that keep the information from ever getting
out.
- C. Can one effectively control the use of credit
information--even if the law says you can? How can you have it
accessible enough to be useful and still enforce rules saying that
only those with a legitimate use can get it?
II. Public Fork:
- A. Merriken v.
Cressman: A school drug prevention
program
- 1. The proposal:
- a. Collect lots of personal information about kids
- b. Without getting informed consent from parents
- c. And use it to figure out which ones are at risk of
drug use, in order to
- d. Take preventive action.
- 2. Arguments against it:
- a. If you decide a kid is likely to use drugs, that may
be a self-fulfilling prophecy or lead to scapegoating by
other children.
- b. Gathering the information may be a violation of
family privacy and the child's loyalty
- c. "Preventive action" means incompetent psychotherapy
by amateurs
- d. Inadequate precautions to keep the information
private.
- 3. Constitutional issues:
- a. Privacy, freedom of speech, etc.
- b. No consent to waiver of rights (even supposing they
are waivable).
- c. A balancing test is appropriate, but goes heavily
against permitting the program.
- B. Robert P. Whalen v.
Richard Roe. Can NY state maintain a file
of names and addresses of those who have gotten a prescription for
controlled substances?
- 1. Precautions by state--barbed wire, locks, 17 people have
access, 24 might get it.
- 2. After 20 months, data had been used in two
investigations.
- 3. District court enjoined enforcement of that part of the
statute as a needlessly broad infringement on the privacy of
patients.
- 4. Supreme Court: Legislation that has some effect on
liberty or privacy need only be a reasonable attempt to achieve
a legitimate state goal; it cannot be enjoined just because the
court thinks it is unnecessary.
III. Rogan v City of Los
Angeles
- A. Against a municipality, must show
- 1. deprivation of protected interest
- 2. due to an official policy etc.
- B. Erroneous information, identifying Rogan as a
wanted murder suspect. Suspect (escapee) had gotten his birth
certificate.
- 1. Readily available information that would avoid confusion
was not included: the real suspect's physical characteristics.
Also a bulletin, more narrowly distributed, with that info.
- 2. Rogan arrested, held, checked, wrong man, released. Five
times.
- 3. The information was repeatedly reentered, as per policy,
with no checking.
- C. Plaintiff deprived of rights because
- 1. NCIC record violated fourth amendment particular
description requirement.
- 2. Maintenance and reentry caused further arrests without
due process of law.
- D. By policy of L.A.?
- 1. Police officers were not trained in how to amend
information in the system or the need to do so.
- 2. They did not even know it was possible to do so, nor did
they consider doing so after initial misidentification
incidents.
- 3. Crotsley had a policy, inconvenient to victim, for
dealing with such situations.
- E. Result
- 1. L.A. is liable.
- 2. Officers are not because of qualified immunity.
IV. Private fork: Thompson
v San Antonio Retail Merchants Association
- A. Automatic capture of information--strengths
and weaknesses.
- 1. Cheap and easy way of adding information, but ...
- 2. Individual merchant may be careless, since he does not
pay costs of error.
- 3. Did in fact misidentify one William D. Thompson (bad
debt) with another and
- 4. Wards denied the latter credit.
- 5. He thought it was because of a recent past felony
conviction for burglary (probation)
- 6. Took a lot of trouble and a court suit to get them to
fix their records
- B. Is SARMA testifying to facts or transmitting
them?
- 1. Suppose the information was added to the data base with
a note of its source? Can SARMA then shift the blame and the
liability to the merchant who reported the information?
- 2. Is it libel to report another's libel? Yes, often. So
shifting the liability will not work--they will both be liable.
- C. FCRA imposes duty of reasonable care--which
was not met here.
- D. Damages: $10,000 + costs.
- 1. Humiliation and mental distress, because ...
- 2. He was falsely suspected of reneging on a $77 debt, when
...
- 3. He was in fact only a convicted felon!
- 4. One suspects punitive etc. motives in the court.
V. Fair Credit Reporting Act.
- A. § 609 Disclosures
- 1. Credit agency must provide subject with all nonmedical
information it has on him,
- 2. Provide him the sources except for investigative
consumer reports, and
- 3. Tell him who the recipients of the information are.
- 4. The act immunizes credit bureaus against defamation
suits and the like, except for violating specific provisions or
acting with malice.
- B. § 611 Procedure for disputing and
recording disputes, and correction.
- C. § 613 Public Record Information
for employment
- 1. Information goes to court or grand jury, or anyone the
subject wants it to go to, or to anyone with a legitimate
business purpose in connection with a transaction involving
that subject.
- 2. Legal rules on when information becomes obsolete.
- a. Why? Bankruptcy 10 yrs.
- b. Only applies to small transactions.
III. Obscenity on-line
- A. Obscenity vs Indecency
- 1. Obscene--It is constitutional to forbid people in
general from reading it. A work is obscene if:
- a. The average person, defined by community standards,
would find that the work as a whole appeals to the prurient
interest, and
- b. The work depicts or describes, in a patently
offensive way, sexual conduct specifically defined by the
applicable state law, and
- c. The work, taken as a whole, lacks serious literary,
artistic, political or scientific value. Not defined
by local community standards.
- 2. Indecent--may be kept from children, but not from
adults (except to the extent that keeping it from adults is an
unavoidable consequence of keeping it from children).
- 3. Sable Communications v FCC:
- a. The invention of Dial-a-porn resulted in a series of
acts, regulations, suits on how much the providers could be
constrained in order to protect children.
- b. 1988 act--total ban on both obscene and indecent, no
FCC regulations to limit serving children required.
- c. The court held that banning obscene speech is
constitutional, even though the standard of obscenity will
vary from place to place. The burden is on the provider to
tailor its product accordingly.
- d. Indecent speech is protected by the 1st amendment, so
the law must restrict it more narrowly to only protect
children. That part of the 1988 act is unconstitutional.
- e. FCC v Pacifica--banned dirty words only by
time of day. And broadcasting "can intrude on privacy
without prior warning as to program content, uniquely
accessible to children, even those too young to read."
"Captive audience, ... unwilling listeners." So the case for
regulating radio is stronger than for regulating telephone
conversations.
- f. Alternatives suggested by the FCC: require credit
card, an access code obtained by providing proof of age, or
a scrambler only sold to adults.
- g. Dissent (Brennan, Marshall, Stevens) holds that
imposing criminal penalties for distributing obscene
material to consenting adults is constitutionally
intolerable, because of the vagueness of the definition of
obscene, hence chilling effect.
- B. How does this apply to networks?
- 1. The act applies to anyone who: "Makes obscene
communication by means of telephone for commercial purpose" or
"permits any telephone facility under such person's control to
be used for ..." That might apply to Compuserve EMail and other
forms of electronic communication to people using modems over
telephone lines. Even if the EMail or posting is not made for
commercial purposes, you could argue that its transmission (by
Compuserve, or a commercial net access provider) is.
- 2. A future act directed at networks would raise
constitutional issues similar to those in Sable. Obscene
could be prohibited but indecent could probably not be if less
extreme alternatives were available. The difficulty of dealing
with a multitude of community standards would not prevent such
an act.
- 3. Currently, alt.talk.sex is not under the act because it
is not commercial. But access providers arguably are!
- C. Technical issue--possible control and
associated liability.
- 1. Posters must identify themselves via digital signature.
Require the poster of indecent or obscene material to restrict
his post to "World minus K12," or "World minus Prudes."
- 2. Posting machine--require users to label posts as "adult
only," "PG," ... ?
- 3. Receiving machine--require it to provide its community
standards in a way that makes it possible for posters to
exclude it if their posts would offend its community. (Sketch
after this list.)
- 4. Who defines the code--courts? Internet standards
committee?
- 5. What about an owned information utility such as
Compuserve or AOL? Must they require age evidence for
customers?
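A sketch of the receiving-machine version (point 3 above): the site declares which labels its community accepts, and the software drops everything else. The labels and the policy here are hypothetical:

    # Labels this machine's community will accept (hypothetical policy).
    site_accepts = {"G", "PG"}

    def admit(post):
        # Posters label their own posts; the receiving machine enforces
        # its community standard by refusing anything outside its list.
        return post.get("label") in site_accepts

    posts = [{"id": 1, "label": "PG"}, {"id": 2, "label": "adult only"}]
    print([p["id"] for p in posts if admit(p)])    # [1]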
I. Why it matters: Interactive Services Association: Not
just obscenity. Defamation, franchise, real estate laws, ...
II. Tangibility, means of transmission:
Does the law apply only to tangible objects? U.S. v.
Carlin is the only case to interpret it--a phone sex case under 18
USC 1465. Is "facility or means of interstate commerce" equal to
"any means of communication," as the judge instructed the jury?
Congress could have added computer and phone terms to the statute;
it did so for child porn, but did not do so here when the statute
was revised. The AG had given the opinion that 1465 did not cover
phone transmission. Not "by private conveyance" (the section the
govt chose to rely on).
[prosecution denies tangible, claims Carlin was
wrong]
III. Who transported it?
Transfer initiated by the customer--like buying a book
and bringing it home. They paid for the call. Civil
analog.
IV. What is the relevant community?
CA? Local police had seized, looked at, released. Not
child porn.
Computer community. World? Customers of that
BBS?
"the states of a legitimate interest in prohibiting
dissemination or exhibition of obscene material when the mode of
dissemination carries with it a significant danger of offending the
sensibilities of unwilling recipients or of exposure to juveniles."
Miller v CA 1973.
Community is users. So no need for laws--if violated,
users go elsewhere.
EFF argues that ... No impact on the local community.
Like reading a book. Can screen out children. Much better
filtering.
Miller court said community standard rule might
result in "some possible incidental effect on the flow of [otherwise
protected] materials across state lines," acceptable because only
"incidental." This is more than incidental. So District court should
have weighed chilling effect against Tennessee interest--case of
first impression.
V. Could they guard against it?
- 1. Descriptions free and obscene. (but not themselves
appealing to prurient ...)
- 2. Thomas had to call inspector in Tennessee to acknowledge
receipt of membership application. But did not know thereafter
where inspector was calling from!
- 3. The inspector claims that Thomas knew he was sending them
child porn; Mr. Thomas denied it.
- 4. No harder than Sable, but ...
- 5. Precedent for the Internet?
VI. Reasons for special laws:
This is an especially good first amendment
medium--low entry barriers, interactive, ... Open to unpopular
speakers--easily chilled.
Could use electronic community, or could prosecute
the buyer, who affirmatively acts to bring material into his
community.
Or balancing test according to how much is obscene
where, and how easy to bar from particular places.
ACLU: stream of 1s and 0s en route, only became
obscenity in the receiver's house. Expand Stanley v Georgia.
EMail on Thomas' network.
Scanned pictures.
Transport for purpose of sale or distribution? After
sale.
Child porn frame: mailed magazines to Mr. Thomas,
watched Mrs. pick up the envelope, followed her home, executed search
warrant. Acquittal on child porn charges.
Jury was shown or told about lots of stuff not
included in the charges. Inflammatory.
Acceptance of responsibility.
Also mailed videotapes.
Misleading advertising.
Should they have used the Sable statute?
Will this become irrelevant with internet from
Netherlands?
I. Review of U.S. v Thomas and related
stuff.
- A. Applying Miller: Relevant community
standards?
- B. Revising Miller for a new technology.
II. Odds and ends.
- A. Teens on AOL
- B. Under CDA who is the offender? Does it depend
on ...
III. Computer Crime:
- A. Jerry Schneider and Pacific Tel. Got into the
order system. Stole/ordered equipment. 40 days in jail. Became a
computer security consultant.
- B. Stanley Mark Rifkin. Working on a backup system
for a bank wire room.
- 1. Authorized employees used a code system--on a piece of paper
in the wire room.
- 2. Called, identified himself as from the international
division, requested $10.2 million to his account in NY, then on
to a Swiss bank.
- 3. Russalmaz got a telegram "from" the head of the wire room,
identifying Lon Stein as a representative, purchasing diamonds
for the bank.
- 4. Stein got a baggage ticket, flew to Luxembourg, looked in the
package--diamonds.
- 5. Told his attorney, who had come up with the diamond idea; the
attorney went to the FBI.
- 6. Tried to get an acquaintance to sell the diamonds for him;
news story; the acquaintance went to the FBI.
- 7. Asked the acquaintance to mail the money back to another
friend. FBI followed, found him.
- 8. Out on bail, got someone to try to make the relevant contact
for a repeat--with an undercover FBI agent. 8 year sentence.
- 9. Expert in computers, not crime. Posturing?
- C. 75% by employees.
- D. Fry Guy. 1989
- 1. Call a customer of Credit Systems of America--credit card
numbers and credit info. Get the customer's ID info by claiming
to be from CSA: acct # and password.
- 2. Called in as a customer, wandered around, got to the staff
area, found a local resident with a valid credit card. So far
could have been done without a computer.
- 3. Rerouted the victim's incoming calls to a phone booth in
Paducah, from there to him.
- 4. Called Western Union, wired $687 to its Paducah office to be
transferred to a friend, gave the victim's credit card. They
called back to confirm. Confirmed. Reprogrammed everything.
- 5. Repeat with another victim.
- 6. Phone hacking. But could also reverse--change phone
number in relevant records to booth.
- E. Captain Zap:
- 1. Hack into credit agency, create good credit rating for
an imaginary company.
- 2. Hack into supplier, create real-world paper trail. cut
order, pay invoice, write delivery manifest, deliver to a mail
drop.
- 3. Caught by connection to the mail drop.
- 4. $500,000.
- 5. Plea bargain to $1000 fine + 2 1/2 yrs probation. 1981.
- 6. State laws thereafter; federal law started in 1986.
- F. First worm at PARC. To do housekeeping. Left
it one night, found it all over the place, killed it, abandoned
the project.
- G. Virus blackmail?
- 1. Junk mail diskette with a unique license--threat in small
print. To the mailing list of a UK magazine.
- 2. Info on AIDS, interactive.
- 3. Counted bootups, after 90 started encrypting files and
hiding programs.
- 4. Asked for money to Panama City address.
- 5. Did considerable damage.
- 6. Attempt to call the number coincided with the U.S. invasion of
Panama. Marine answered.
- 7. Bogus Nigerian businessmen.
- 8. Caught the man because he was crazy. Company seal in his
bags. Amsterdam police.
- 9. Unfit to stand trial. Or a legitimate business device?
(disabling program precedent)
- 10. Extradited to Britain. Got crazy enough not to be
tried.
- 11. A million disks in his house.
- 12. Would it work if done intelligently?
- 13. Against one corporation? How you could do it:
- a. Subvert company, sell short.
- b. Time bomb customers, blackmail company.
- c. Are we too late?
- H. Market in Computer Crime?
- 1. Need division of labor.
- 2. Offshore data havens.
- 3. Market in live credit cards.
- 4. In free long distance: sidewalk enterprise, free calls
on pay or cellular.
- 5. $1.4 million in 4 days against one PBX. $10/call?
- I. Leslie Lynn Doucette. Hacker service
industry.
- 1. Gets a number from someone over the phone.
- 2. Check it by hacking or calling a chat line phone number.
- 3. PBX has a long distance from 800 option. Use for
communication.
- 4. Voice mail computers as bulletin boards.
- a. Hacker boards were known, monitored--credit cards
could be cancelled.
- b. Find an empty box in a voice mail system, use it. Low
security because ...
- c. Leave lists of verified codes.
- d. Subordinates pick up, get money, send to her.
- 5. Real estate man found his voice mail system overloaded
with free riders.
- 6. Secret Service had a tip about Doucette from Canada
(convicted, left)
- 7. Informants said Chicago.
- 8. Dialed Number Recorder on her phone.
- 9. Then on her 5 major subordinates.
- 10. Plea bargain, 27 months, 1990. Claimed $1.6 million in
losses.
- J. Citibank hack. N.Y. recent. EFT
intercept.
- 1. On Telenet. Trial and error found addresses for a bunch
of Citinet banks.
- 2. Found a computer that might be for EFT, got in through
default password left active, created program to log all
transmissions to their file.
- 3. Next day logged on, bingo. Captured hundreds of
transactions, vanished w/o a trace.
- 4. Opened a numbered Swiss Account. Got birth certificates,
new ID and SS#
- 5. Opened accounts at six banks in Houston and Dallas.
- 6. Rigged Citicorp computer to send to their Telenet
terminal, collected, returned acknowledgement. Real transfers.
- 7. Then transferred the money to the Swiss bank, then
withdrew to U.S. accts $7,333 each (below notice requirement).
- 8. End of week each got $66,000.
- 9. Citibank denies. Is it true? Posted to a BBS.
0: New stuff:
- A. New Computer crime: explain. dejaNews?
- B. British doctors vs British spies
- C. CMU to censor "obscene" newsgroups.
- D. Canadian cases--a child porn text story got a
BBS in trouble. Computer-created child porn pictures got someone
else in trouble.
I. Review:
- A. Filling out the Rifkin story. He was set up for
the second charge because there were legal problems with the
first.
- B. One other way of profiting by a virus--be in
the fixit business.
II. Low tech computer crime:
- A. Name vs number story. Caught because the W-2 was too high.
Solution.
- B. Criminal on work furlough--job in accounts
payable of city govt.
- 1. Duplicate vouchers, changed addresses
- 2. Cashed and converted into gold coins etc.
- 3. Eventually spotted duplicates.
- 4. Trying for sixteen million; caught past one.
- C. Sabotage out of boredom story.
- D. ATM repair scam.
III. Another extortion--all tapes and backups and
backups of ... Caught on payoff.
IV. Stealing services from
ex-employer. Thought it was all right--they
would have ...
- Milling machine tapes.
- Was helping their customers, plus providing free
services for new employer that would have been offered free as
marketing.
- Probation and restitution
- Lesson about psychology of morals.
V. Leslie Lynn Doucette. Hacker service
industry.
- 1. Gets a number from someone over the phone.
- 2. Check it by hacking or calling a chat line phone number.
- 3. PBX has a long distance from 800 option. Use for
communication.
- 4. Voice mail computers as bulletin boards.
- a. Hacker boards were known, monitored--credit cards could
be cancelled.
- b. Find an empty box in a voice mail system, use it. Low
security because ...
- c. Leave lists of verified codes.
- d. Subordinates pick up, get money, send to her.
- 5. Real estate man found his voice mail system overloaded
with free riders.
- 6. Secret Service had a tip about Doucette from Canada
(convicted, left)
- 7. Informants said Chicago.
- 8. Dialed Number Recorder on her phone.
- 9. Then on her 5 major subordinates.
- 10. Plea bargain, 27 months, 1990. Claimed $1.6 million in
losses.
VI. Market in Computer Crime?
- 1. Need division of labor.
- 2. Offshore data havens.
- 3. Market in live credit cards.
- 4. In free long distance: sidewalk enterprise,
free calls on pay or cellular.
- 5. $1.4 million in 4 days against one PBX.
$10/call?
VII. Check Kiting story.
- A. Explain kiting
- B. Use of computer
- C. Crash--and crash.
VIII. Card counting? Not
illegal. $10,000 materials, $390,000 labor.
Four teams, eleven people. In 22 days made $130,000.
Two systems captured, FBI reported just a computer. No
indictments.
IX. How much computer
crime? A total of 1000 cases, 1958-1981, tabulated;
flat after about 1973. Financial > Govt > Student.
X. Suppose you discover a hole in someone's
security... Responsible you.
- A. Real example.
- B. Problem--practical
- C. legal.
XI. Protecting one program from
another.
- A. Old version--multiuser mainframe
- B. New version--many programs in a micro.
- C. How to do it.
- D. Market inside computer approach. Some programs
might cheat.
I. Bank slip story.
II. How to predict the future:
- A. The non-obvious consequences of what is
already in place.
- 1. Heinlein and the car.
- 2. word processor->end of secretary as typist. Irvine.
- B. The consequences of what is already
invented. Computer -> word processor.
- C. What will be invented.
III. What is already in place: How is the
(computer) world different from 15-20 years ago (and most of our
crime experience)?
- A. Then: a small number of big machines, many
handling large sums, lots of custom software, lots of people who
knew little about computers. Inverted hierarchy of
age/expertise.
- B. Lots of small machines.
- 1. Running almost entirely off-the-shelf software
- 2. Many of them networked
- 3. Most of them controlling small amounts of wealth.
- 4. Few with encryption or digital signatures.
- 5. Many belonging to people who do not know very much about
computers
- C. Many bigger machines
- 1. With lots of communication, but ...
- 2. Firewalls where necessary.
- 3. Encryption and digital signatures where necessary, or
rapidly coming online.
- 4. Mostly off the shelf software?
- 5. Mostly run by people who know a fair amount about
computers?
IV. Implications for computer crime:
- A. Then: Programming scams. Data entry scams.
Hacking mostly for fun. Viruses (the only micro crime) mostly for
fun. Piracy. Use as a tool. Theft of services. Information
theft.
- B. How do those come across to now?
- 1. Programming scams only via bogus versions, illicit
modification.
- 2. Data entry over the net?
- 3. A thousand hackers wouldn't get noticed.
- 4. Tool and piracy--always. Much more.
- 5. Theft of services? Maybe use of lots of machines at
once???
- 6. Information theft?
- a. Via net, less because of firewalls, but ...
- b. Still via EMail interception. Computerised searching?
- c. Still from employees--unchanged.
- C. New possibilities:
- 1. High volume low amount scams, such as:
- a. Blackmail scheme suggested (alternate
explanation--also volume scam)
- b. Chain letters.
- c. Any fraud where cheap communications are key--lots of
them.
- d. But payoff is still a weak link; on the other hand ...
- e. Most of it can be done secretly abroad, by experts.
- 2. Modifying or fooling off-the-shelf software. Again high
volume with computer bill paying and the like.
- 3. Large amount crimes that depend on holes in the
encryption etc. scheme.
- a. This is what computers were--inverted hierarchy.
- b. Hackers again? Clever tech + social engineering.
- 4. Information theft
- a. via equivalent of wiretap for EMail.
- b. via equivalent of hacking--get into the unprotected
computers. Who has info worth stealing?
- 5. International net?
- a. Using one country to do things illegal in another
(Porn: Neth->US, Heresy U.S.->Iran, Piracy:
China->US)
- b. Violating labor laws on a large scale. Child etc.
- c. Coordinating a crime outside of the jurisdiction.
Division of labor.
- D. Are the old crimes dead?
- 1. Data entry an option for a while, but ...
- a. as more done over the net by originator ... .
- b. shifts to a network interception crime instead.
- 2. hacking gets lost in forest
- a. Unprotected small machines in the tens of millions,
small amounts at stake.
- b. Big machines protected.
- c. What about--Foreign geeks 20 hrs before a screen?
- 3. Contract programming scams down--off the shelf.
- 4. Theft of services? Unlikely. Processing power is so cheap
that stealing it is more trouble than it is worth.
- a. What about foreigners,
- b. With net connections, without access to bigger
machines?
- c. Kids ditto? Unlikely.
V. Next step: Crime in a world of strong
privacy.
I. More suggestions on crimes in the new
world?
- A. Crimes depending on large numbers,
connectivity?
- 1. What crimes became possible with the phone? Some con
games?
- 2. What about victimless crimes?
- a. Gambling is easy with digital cash
- b. Phone sex from outside the U.S.
- 3. What about crime info? ANFO or Phonefreaking info to the
world?
- 4. Harassment: Tell the world my VISA number. Phone charge
number.
- 5. Blackmail--we won't do 4 if ... .
- 6. Blackmail: Information exchange. Automate it with real
info. How do you know it is real? Victim pays. If not, send it
to appropriate target (also provided by subcontractor).
- B. Depending on crypto ignorance?
- C. Other?
II. How do you prevent these?
- A. Passive.
- 1. Checksums on CDRom or equivalent? Other techno
protections? (Sketch after this list.)
- 2. Firewalls inside computers, bringing only data off the
net?
- 3. Software that requires active confirmation for cash
transfers?
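For the checksum idea, a sketch: compare the data's hash against a known-good value brought in on read-only media or under the manufacturer's digital signature. The record and reference value here are hypothetical:

    import hashlib

    def checksum(data):
        # SHA-256 of the data; in practice, read a file in chunks.
        return hashlib.sha256(data).hexdigest()

    # Reference value computed while the data was known good and stored
    # on read-only media (hypothetical record).
    record = b"acct 1041: balance $77.00"
    known_good = checksum(record)

    tampered = b"acct 1041: balance $0.00"
    print(checksum(record) == known_good)      # True--data intact
    print(checksum(tampered) == known_good)    # False--alteration detected

If the reference values live on media an intruder cannot rewrite, covert alteration shows up the next time the checksums are compared.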
- B. Active:
- 1. FTC or equivalent prowling the net? How do you find
perps?
- 2. Vigilantes?
- 3. Reliable data banks on scams?
- 4. Filters for rent--like surfwatch.
- 5. Fight info with info.
- C. Framework:
- 1. Digital signatures. 10 million 100 byte records on one
CDROM.
- 2. Sites that only accept signed and verified messages.
- 3. Auto encrypting stuff a la Netscape to prevent
interception/altering.
- 4. Caveat Creditor.
- D. Are there interesting niches? For
lawyers?
Computer Crime Summary
I. Old technology:
- A. Small number of big machines, many handling
large sums, lots of custom software, lots of people who knew
little about computers. Inverted hierarchy age/expertise.
- B. Programming scams. Data entry scams. Hacking
mostly for fun. Viruses (the only micro crime) mostly for fun.
Piracy. Use as a tool. Theft of services. Information
theft.
- C. Problem of division of labor in illegal
markets.
II. Current technology: Crime based on what is
here now.
- A. Lots of small machines.
- 1. Running almost entirely off-the-shelf software
- 2. Many of them networked
- 3. Most of them controlling small amounts of wealth.
- 4. Few with encryption or digital signatures.
- 5. Many belonging to people who do not know very much about
computers
- B. Many bigger machines
- 1. With lots of communication, but ...
- 2. Firewalls where necessary.
- 3. Encryption and digital signatures where necessary, or
rapidly coming online.
- 4. Mostly off the shelf software?
- 5. Mostly run by people who know a fair amount about
computers?
- C. Old crimes?
- 1. Programming scams only via bogus versions, illicit
modification.
- 2. Data entry over the net?
- 3. A thousand hackers wouldn't get noticed.
- 4. Tool and piracy--always. Much more.
- 5. Theft of services? Maybe use of lots of machines at
once???
- 6. Information theft?
- a. Via net, less because of firewalls, but ...
- b. Still via EMail interception. Computerised searching?
- c. Still from employees--unchanged.
- D. New possibilities:
- 1. High volume low amount scams, such as:
- a. Blackmail scheme suggested (alternate
explanation--also volume scam)
- b. Chain letters.
- c. Any fraud where cheap communications are key--lots of
them.
- d. But payoff is still a weak link; on the other hand ...
- e. Most of it can be done secretly abroad, by experts.
- 2. Modifying or fooling off-the-shelf software. Again high
volume with computer bill paying and the like.
- 3. Large amount crimes that depend on holes in the encryption etc. scheme.
- a. This is what computers were--inverted hierarchy.
- b. Hackers again? Clever tech + social engineering.
- 4. Information theft
- a. via equivalent of wiretap for EMail.
- b. via equivalent of hacking--get into the unprotected
computers. Who has info worth stealing?
- 5. International net?
- a. Using one country to do things illegal in another
(Porn: Neth->US, Heresy U.S.->Iran, Piracy:
China->US)
- b. Coordinating a crime outside of the jurisdiction.
Division of labor.
- E. Defenses?
- 1. Firewalls etc. inside ordinary micro computers.
- 2. On line information providers about scams, etc.
- 3. Entrapment by law enforcement.
- 4. Mechanisms for checking data integrity--CDRom for
example.
- 5. Note that info can be distributed using digital
signatures from manufacturer.
- F. New legal issues?
- 1. Does a software manufacturer have a cause of action against someone who covertly alters its software? Against someone else for negligence in permitting it?
- 2. International law issues:
- a. Minnesota equivalent (a state asserting jurisdiction over net activity originating elsewhere).
- b. How active are we willing to be? Panama invasion?
- 3. Others?
II. Farther future: Strong Privacy
- A. Assumptions: Clipper and Son of Clipper fail.
Widespread digital signatures, public key encryption, remailers,
etc.
- B. Pirate archives:
- 1. Enforcement goes to the user level.
- 2. What if firms have anonymous employees?
- 3. Enforcement by contract.
- 4. Live without IP protection.
- C. Criminal firms with brand name
reputation:
- 1. Blackmail. Live carefully.
- 2. Assassination. Cui bono?
- 3. Entrapment. That is why you need a reputation.
- D. Fight info with info? Entrapment. Scam
bank.
I. FBI wiretap proposal:
- A. It is a Federal register notice outlining
capacity requirements. under the Communications Assistance for Law
Enforcement Act (the "Digital Telephony" bill or "CALEA").
- B. How derived: The capacity figures that are
presented in this initial notice were derived as a result of a
thorough analysis of electronic surveillance needs. Information
regarding electronic surveillance activities for a specific time
period was obtained from telecommunications carriers, law
enforcement, U.S. District Courts, State Courts, State Attorneys
General, and State District Attorneys to establish a historical
baseline of activity. The historical baseline of electronic
surveillance activity was determined after examination of both the
location and occurrence of each electronic surveillance reported.
The historical baseline was then analyzed to derive the total and
simultaneous electronic surveillance activity by switch and within
specific geographic areas. Future capacity needs were then
determined after consideration of the impact of demographics,
market trends, and other factors on the historical
baseline.
- C. Historical numbers--not provided by
FBI:
- 1. Wiretaps, Fed & State, 1994, Title III: 1,154
- 2. Foreign Intelligence Surveillance Act: 576
- 3. Pen registers: 17,410
- 4. Trap and trace (records originating numbers): 4,789
- 5. Total: about 24,000 during the year
- D. Requirements: three geographical categories, each with an actual (3/4 yrs) and a maximum figure.
- 1. Cat I + II = 25% of equipment.
- 2. All surveillance, incl. pen registers.
- 3. Actual/maximum: Cat I: 0.5%/1%; Cat II: 0.25%/0.5%; Cat III: 0.025%/0.25%.
- 4. EPIC estimates--about 30,000 simultaneous taps or
traces.
- E. Freeh: Deputy Attorney General Gorelick said
this morning: "Let me make it perfectly clear, there is no
intention to expand the number of wiretaps or the extent of
wiretapping."
- F. Recent past: State fairly stable; Fed 1992: 340, 1994: 554.
- 1. The vast majority of cases where wiretaps are approved
involve drug trafficking.
- 2. In 1994, seventy-six percent of all orders were for
narcotics investigations and eight percent each for
racketeering and gambling.
II. Clipper II.
- A. 64 bit, private, but there must be key escrow with govt-certified escrow agents.
- 1. Is 64 = 56? For "64 bit" DES, yes: 8 of the 64 key bits are parity, leaving 56 effective bits.
- 2. So if they manage to limit encryption to "64 bit" DES rather than a true 64-bit keyspace, it matters
- 3. by a factor of 2^64 / 2^56 = 2^8 = 256 in brute-force work (see the arithmetic below).
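The factor of 256 as back-of-the-envelope Python:

```python
# Keyspace arithmetic for the 64-vs-56-bit point above.
keys_des = 2 ** 56   # effective DES keyspace (8 of 64 key bits are parity)
keys_64 = 2 ** 64    # a true 64-bit keyspace

print(keys_64 // keys_des)   # 256: every brute-force search takes 256x longer

# If searching all 2^56 DES keys takes one day, a true 64-bit key takes:
print(keys_64 / keys_des, "days")
```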
- B. No interoperability with unescrowed
programs.
- C. The proposal was recently rejected by an industry group with lots of big players.
III. Default rule for your
info: Avrahami claims it requires express
consent.
IV. S. 1360: Medical Records Privacy
Act
- A. Must provide info to subject unless endangers
life or confidentiality or only used for internal admin
purposes.
- B. Subject may ask for a correction, if not
granted insist on having statement of disagreement
included.
- C. Must maintain record of disclosures.
- D. May disclose only if purpose compatible with
and related to purpose for which obtained
- E. May disclose pursuant to authorisation by subject.
- F. May disclose for creation of non-identifiable info without authorisation to certified info services. They must remove id.
- G. May disclose for health research, which must
remove identification unless internal process determines info is
needed.
- H. May disclose to court etc., or subpoena with
notice to subject.
- Objections--preempts state confidentiality and
most common law. Dangerous to HIV patients?
V. Net censorship opposed by Cato, ACLU, Brookings
VI. Monitoring vs spoofing: Netscape vs public key
phone book.
VII. Drop-in encryption?
- A. Apple story.
- B. CAPI (Cryptographic Applications Programming
Interface). NATO initiative. NSA mixed reviews.
VIII. Rumour boards vs credible boards?
IX. Should blackmail be illegal?
- A. No--mutual benefit. Wrong.
- B. Yes--hinders law enforcement. wrong.
- C. Maybe--private enforcement, but ...
- 1. Wrong laws?
- 2. Or prevention of social fraud.
- 3. Is the world better if social fraud is easier or harder?
Deja News, etc.
Two recent academic cases
I. Cornell: http://joc.mit.edu/~joc/
- A. Did they do anything wrong? Should there be a
rule against it?
- 1. Mailed to people not offended
- 2. Is Nabokov at fault if I put a copy of Lolita in the local elementary school library, in the hope of getting him in trouble?
- B. Did Cornell do anything wrong? How do you
interpret their action--was the punishment really
voluntary?
II. Caltech case: Mercury.
- A. Based on story.
- 1. Acquitted, and ...
- 2. Terrible evidence, including alibi, so ...
- 3. Caltech is wrong
- 4. Puzzling--they should know EMail evidence is worthless.
- B. Attorney's version.
- 1. Acquitted because:
- 2. Password changed after they broke up, and ...
- 3. Some material on Cray, only 30 students on campus had
access.
- 4. EMail sent by devious method, but keystroke monitor
traced it.
- 5. Mostly non EMail related evidence, including letter to
her family in China warning that they would get daughter back
in a coffin if she did not marry him.
- C. Thesis sponsor's version.
- 1. lovers for some months, planned to marry, broke up,
brief quarrel.
- 2. Still friends for months after, he helped her do her
research on his account with his software.
- 3. Jan 1 met, discovered living with new boyfriend,
quarrel.
- a. Jinsong: Threat to make Jiajun's guilty secrets public. Context: PRC is a sexually conservative culture.
- b. Jiajun and Bo: Duel, gun talk.
- c. Jinsong: Counterthreat to prosecute
- 4. That night, obscene EMail forged by Jinsong's account, from Bo to Jiajun.
- 5. Jan 2-3rd, complaint, account frozen,
- 6. Jinsong EMails Bo, now his responsibility.
- 7. Jan 3rd, alibi'ed case, EMail to Bo with threats.
- 8. 1/4 J&B complain to Caltech of harassment from
August, showed gun permit, threats.
- 9. 1/6 complaint to police, rape added.
- 10. 1/6 bogus EMail from friend of Bo out of town. Well
informed.
- 11. Jinsong held on $150,000 bail, physically slight (15 size), 5 1/2 months. Bo and Jiajun refuse to cooperate with lowering bail. Friends pay for an attorney.
- 12. Keystroke evidence brought out by defense.
- 13. Rape inconsistency.
- 14. Acquittal without reaching question of truth of threats.
- 15. Biased hearing, dean who had intervened in trial against Jinsong.
- D. Attorney. Bo and brother tried to pressure other Chinese students, with threats. Active in prosecution, prepared a 20 page pamphlet including stories about unrelated Chinese murder cases. Grammar argument.
- E. My points:
- 1. Attorney tried to keep me from hearing the other side. 3
vs 7 volumes
- 2. Did not mention evidence re password.
- 3. Motive.
- 4. Gun permit.
- 5. Whole story of the relationship inconsistent.
- 6. To believe them, must believe that his fellow students
are lying.
- 7. Character Evidence.
- 8. Will get more, put on reserve.
I. Would digital signatures have solved this
problem?
- A. Only if Jinsong did not tell Jiajun
- B. But he would have--he trusted her, told her
his password, loved her, ...
- C. Could one create institutions such that he
would not have?
- 1. Strongly felt customs of privacy? That strong?
- 2. Mechanical? Card that he must have, cannot copy, will
not loan out.
- 3. Fingerprint or retina or ... ? with stored private key
that owner cannot get?
- D. But it would eliminate the forged message to
Jiajun, since Jinsong would not have been able to forge Bo's
digital signature, so could not have expected to persuade anyone
the message was genuine.
II. Does Jiajun have adequate motive?
- A. Different standards of sexual behavior.
- 1. We don't know how different, but ...
- 2. They don't seem to have been very careful to keep it
secret, but ...
- 3. It sounds as though she admitted to as little sex as possible at the trial.
- B. Might the threat have been something else
(secrets about her family?)
- 1. Foreign communities here may care a lot about issues not
central to us.
- 2. Perhaps she had told him something that would get her
family in trouble with the govt, or ...
- 3. That her family spied for the government, which might
get her in trouble with other Chinese here, or...
- 4. ???
III. Lessons of the case for us:
- A. Risks of anti-harassment, stalking, etc.
laws.
- 1. Everything that makes it easy to convict the guilty
makes it easier to frame the innocent.
- 2. The more severe the penalty, the more serious the threat
available
- 3. Here Jinsong was punished severely despite being acquitted--5 1/2 months + $20,000.
Course Summary:
I. Computer Crime.
- A. Fitting it in ...
- 1. By description? Moving things around vs destroying or
modifying property?
- 2. By analogy? But then what analogy you use matters a lot.
- a. "intruding" or "defrauding" or "talking" for hacking
into a computer.
- b. Where do cyberspace events happen? The Thomases.
- c. Is my private key my castle?
- d. Is breaking copy protection burglar's tools? White
Paper v old cases.
- e. Is stealing time on a computer stealing? Stealing services? Lund v VA
- 3. Is information property?
- a. Copyright says no (Dowling), U.S. v
Riggs says yes.
- b. How valued. Bell South.
- c. Trade secret as an old version of this
- d. But a lot more value as easily copied, transferred
info than there used to be. Is a difference in degree a
difference in kind?
- e. Controlling info cuts close to freedom of speech issues: Hacker Crackdown. Old version: AIDS book.
- f. Civil vs criminal? Why does criminal law exist?
- 4. Common sense problems: NYT story. Randal Schwartz, Perl
- a. Big name network person, subcontract sysop for part
of Intel, then different part,
- Ran Crack against his old part, found Deacon, moved
password list, broke 48/600
- 5 yrs probation, $170,000 legal fees, $72,000 maybe in
damages.
- b. Maybe a replay of U.S. v Seidlitz, although clearer.
- c. Also Unix/hacker crackdown issue, also fuzzier.
- d. Is this simply jury ignorance, or ...
- e. Intel goofed, did not want to admit it, or
- f. sending a signal, or ...
- g. Intel's left hand does not know what its right hand
is doing.
- h. Digression--encoded password files (see the sketch below).
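A minimal sketch of what a cracker like the Crack tool does with an encoded password file; SHA-256 here is a stand-in for the old Unix crypt(3) function (which likewise ran one way only), and the accounts and word list are made up:

```python
import hashlib

def h(password):
    # One-way "encoding": the system stores only this, never the plaintext.
    return hashlib.sha256(password.encode()).hexdigest()

# A stolen "encoded" password file: account -> hash.
shadow = {"alice": h("butterfly"), "bob": h("x9$Qz!7pT")}

dictionary = ["password", "letmein", "butterfly", "dragon"]

# The attack: guess words, encode each guess, compare with the file.
for account, stored in shadow.items():
    for guess in dictionary:
        if h(guess) == stored:
            print(f"broke {account}: {guess}")
            break
    else:
        print(f"{account}: survived the dictionary attack")
```

Running a big dictionary this way is how a cracker "breaks 48 of 600" accounts: it only catches users whose passwords are guessable words.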
- 5. Also analogy problem--Intel claimed it was "theft" to
move file from one Intel computer to another.
- 6. By legislation.
- a. Many states now make altering computer program,
files, etc. criminal.
- b. Federal ECPA
- 7. By realspace property and contract law? Specify when you
connect.
- B. Special nature?
- 1. People working with things they don't understand
- 2. And their status inferiors do.
- C. Old computer crime: talked it to death.
- 1. Data entry frauds.
- 2. Hacking--persistence, social engineering, technical
skill.
- a. Largely remote con game.
- b. Like telephone frauds
- c. And mail frauds.
- d. Plus victim ignorance.
- D. New Computer crime
- E. World of Strong Privacy
- 1. It's here in embryo.
- a. DigiCash and Mark Twain Bank are now in business.
- b. As is VeriSign.
- c. Netscape offers encryption support.
- d. My newsreader supports anonymous remailers.
- e. ABA has its digital signature guidelines at
http://www.intermarket.com/edl
- f. NetBack (http://www.netback.com) Digital
notary--timestamps and stores documents
- 2. Prevent it--Clipper 2, export control
- 3. Control it. Require digital signatures to post. Real
internet driver's license?
- 4. Or live with it.
- a. Use secrecy. My recent Usenet story.
- b. Replace IP where you can't protect it.
- c. Virtues of digital signatures--consensual relations
easier than before.
- d. Cancelmoose vs Spam, Flame vs Spam, etc.
- 5. Ask--can the objective be accomplished?
- a. What is the point of shutting down the Thomases
- b. If they just move to Amsterdam?
- c. What is it worth to cut the lines to Amsterdam?
II. Privacy
- A. Against legal data bases
- 1. Is privacy a good thing? Why?
- 2. What about freedom of contract--if they see it, it's theirs?
- 3. Current rules more restrictive, but ...
- 4. How well enforced?
- 5. What would a world of less privacy be like? Web may give
it deliberately. Linking my lives.
- B. Against data spying: Encryption, if
permitted
- C. Against data interception: legal limits; encryption if permitted (see the sketch below).
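A minimal sketch of encryption against interception and alteration, using the Fernet recipe from the third-party Python "cryptography" package (authenticated symmetric encryption). Key distribution is waved away here; in practice the shared key would be set up by public-key methods, as in Netscape's approach:

```python
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # shared secret between the two parties
box = Fernet(key)

token = box.encrypt(b"meet at noon")   # what a wiretapper sees: ciphertext

print(box.decrypt(token))              # b'meet at noon'

# Alteration in transit is detected, not silently accepted:
tampered = token[:-1] + (b"A" if token[-1:] != b"A" else b"B")
try:
    box.decrypt(tampered)
except InvalidToken:
    print("message altered in transit -- rejected")
```

Authenticated encryption thus answers both threats at once: the interceptor learns nothing, and the alterer is caught.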
IV. Random number issue--how to choose a
password.
- A. What is relevant is randomness to everyone but you--i.e., can they guess it?
- B. So the perfectly selected password is obvious to you, but to nobody else.
- C. PGP passphrase (see the sketch below).
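One mechanical way to get unpredictability (at some cost in the "obvious to you" property) is to draw words uniformly at random from a list, which also makes the guessing difficulty calculable. A minimal sketch; the word list is a toy, where a real one would run to thousands of entries:

```python
import math
import secrets

WORDS = ["correct", "horse", "battery", "staple",
         "lecture", "seminar", "privacy", "cipher"]

def passphrase(n_words, words=WORDS):
    # secrets.choice draws from the OS randomness source, so the result
    # is unpredictable to everyone -- including you, until memorized.
    return " ".join(secrets.choice(words) for _ in range(n_words))

print(passphrase(4))

# Guessing difficulty: len(words) ** n_words equally likely phrases.
bits = 4 * math.log2(len(WORDS))
print(f"about {bits:.0f} bits of unpredictability with this toy list")
```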
V. Digital time stamping idea.
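A minimal sketch of the idea, in the spirit of the hash-chain schemes that digital notaries such as NetBack (above) implement: publish the hash of each document, chained to the previous entry, so a document's existence at a given time can be proved later without revealing its contents. Details here are illustrative, not any service's actual protocol:

```python
import hashlib
import time

def stamp(document: bytes, prev_digest: str) -> str:
    # Chain this document's hash to the previous stamp, so entries
    # cannot later be reordered or backdated without detection.
    record = prev_digest.encode() + repr(time.time()).encode() + document
    return hashlib.sha256(record).hexdigest()

chain = ["0" * 64]  # genesis entry
for doc in [b"first draft", b"second draft"]:
    chain.append(stamp(doc, chain[-1]))

# Publishing these digests widely (in print, say) fixes the order and
# rough date of the drafts without revealing their contents.
print(chain[1:])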
VI. Cornell--what makes it
private/public?