Analytical Methods for Lawyers
v Idea of the Course
¯ Brief survey of lots of areas useful to lawyers
¯ Many of which could be a full course--my L&E
¯ Enough so that you won't be lost when they come up, and …
¯ Can learn enough to deal with them if it becomes necessary.
v Mechanics
¯ Reading is important
¯ Discussion in class
¯ Homework to be discussed but not graded--way of testing yourself
¤ Prefer handout hardcopy or on web page? URL on handout
¯ Midterm? First time.
v Topics
¯ Decision Analysis
¯ Game Theory
¯ Contracting: Application of Ideas
¯ Accounting
¯ Finance
¯ Microeconomics
¯ Law and Economics
¯ Statistics
¯ Multivariate Statistics: Untangling one out of many causes. Death penalty
v First Topic: Decision Analysis
¯ Way of formally setting up a problem to make it easier to decide
¯ Typically
¤ Make a choice.
¤ Observe the outcome, which depends partly on chance
¤ Make another choice.
¤ Continue till the end, get some cost or benefit
¤ Want to know how to make the choices to maximize benefit or minimize cost
¯ Simple Example: Settlement negotiations
¤ Accept settlement (known result) or go to trial
¤ If trial, win with some probability and get some amount, or lose and have costs
¤ Compare settlement offer to average outcome at trial, including costs.
¯ Fancy example: Hazardous materials disposal firm
¤ You suspect employees may have cut some corners, violated disposal rules
¤ First choice: Investigate or don't.
á If you don't, probably nothing happened (didn't violate or don't get caught)
á If you do, some probability that you discover there is a problem. If so …
¤ Conceal or report to EPA
á If you conceal, risk of discovery--greater than at previous stage (whistleblowers)
á If you report, certain discovery but lower penalty
¯ In each case, how do you figure out what to do? Two parts:
¤ If you knew all the probabilities and payoffs, how would you decide (Decision Analysis)
¤ What are the probabilities and payoffs, and how do you find them?
¯ Simple case again: Assuming numbers
¤ First pass
á Settlement offer is $70,000
á Trial cost is $20,000
á Sure to win
á Tree diagram
á Lop off inferior branch--easy answer
¤ Second pass: As above, but 60% chance of winning
á Square for decision, circle for chance node
á On average, trial gives you $40,000
á Is that the right measure?
á If so, inferior. Lop off that branch
á Settle
¤ Risk aversion
á If you are making similar decisions many times, expected value.
á If once, depends on size of stakes.
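A minimal Python sketch of the settle-or-sue arithmetic above. The $100,000 award if you win is an assumption, chosen so that the expected trial outcome matches the $40,000 in the notes; looping over the win probability is a crude sensitivity analysis.

# Settle-or-sue sketch, using the numbers from the example above.
# Assumed: a $100,000 award on a win (so 0.6 x 100,000 - 20,000 = 40,000).

def expected_trial_value(p_win, award, trial_cost):
    """Expected net recovery from going to trial (risk neutral)."""
    return p_win * award - trial_cost

settlement = 70_000
award = 100_000        # assumption, not stated in the notes
trial_cost = 20_000
p_win = 0.60

trial_ev = expected_trial_value(p_win, award, trial_cost)
print(f"Expected value of trial: ${trial_ev:,.0f}")
print("Settle" if settlement > trial_ev else "Go to trial")

# Sensitivity analysis: at what win probability does trial beat the offer?
for p in (0.5, 0.6, 0.7, 0.8, 0.9, 1.0):
    ev = expected_trial_value(p, award, trial_cost)
    print(f"p(win) = {p:.0%}: trial EV = ${ev:,.0f} ->",
          "trial" if ev > settlement else "settle")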
¯ Where do the numbers come from?
¤ Alternatives: Think. Talk to client, colleagues, … Think through alternatives.
á Partly your professional expertise
á Forces you to think through carefully what the alternatives are.
¤ Probabilities
á Might have data--outcome of similar cases in the past. Audit rate.
á Generate it--mock trial. Hire an expert.
á By intuition, experience. Interrogate. What bets would I accept?
¤ Payoffs
á Include money--costs, profits, fines, … Past cases, experts, …
á Reputational gains and losses
á For an individual, moral gains and losses? Other nonpecuniary?
¯ Sensitivity analysis
¯ (Land Purchase Problem?)
¯ Is ethics relevant?
¤ Criminal trial--does it matter if you think your client is guilty?
¤ EPA--does it matter that concealing may be illegal? Immoral?
á What if not looking for the problem isn't illegal, but …
á Finding and concealing is?
v Query re Becca
v Mechanics
¯ Office Hours handout
¯ Everyone happy with doing stuff online?
v Review: Points covered
¯ Basic approach
¤ Set up a problem as
á Boxes for choices
á Circles for chance outcomes
á Lines joining them
á Payoffs, + or -, and probabilities.
¤ Calculate the expected return from each choice, starting with the last ones
á Since the payoff from one choice
á May depend on the previous choice or chance.
¤ If one choice has a lower payoff than an alternative at the same point, lop that branch
¤ Work right to left until you are left with only one series of choices.
¯ Complications
¤ Expected return only if risk neutral
¤ You have to work out the structure, with help from the client and others
¤ Estimate the probabilities, and …
¤ Payoffs, not all of which are in money.
¯ Sensitivity analysis to find out whether the answer changes if you change your estimates.
v Handout problems
¯ Settle or go to trial
¯ Which contract to offer
¤ Easy answer for the team
¤ Note that we have implicitly solved the player's problem too.
á Upper contract, if he has back pain, playing costs him $2 million, gets him nothing, not playing neither costs nor gets, so don't play
á Lower contract, if he has back pain, playing costs him $2 million, gets him $10 million. Not playing gets and costs nothing. So he plays.
¤ Note also a third option, that we didn't mention--no contract.
á Better than the first
á Could change the numbers to make it better than the second
á Demonstrating that one has to figure out the structure of the problem.
v Questions?
v More book problems
¯ Land purchase problem
v Game Theory Intro: Show puzzling nature by examples
¯ Bilateral monopoly
¤ Economic case--buyer/seller, union/employer
¤ Parent/child case
¤ Commitment strategies
á In economic case
á Aggressive personality.
1/17/06
v Move to front of the room
v Strategic Behavior: The Idea
¯ A lot of what we do involves optimizing against nature
¤ Should I take an umbrella?
¤ What crops should I plant?
¤ How do we treat this disease or injury?
¤ How do I fix this car?
¯ We sometimes imagine it as a game against a malevolent opponent
¤ Finagle's Law: If Something Can Go Wrong, It Will
¤ "The perversity of inanimate objects"
¤ Yet we know it isn't
¯ But consider a two person zero sum game, where what I win you lose.
¤ From my standpoint, your perversity is a fact not an illusion
¤ Because you are acting to maximize your winnings, hence minimize mine
¯ Consider a non-fixed sum game--such as bilateral monopoly
¤ My apple is worth nothing to me (I'm allergic), one dollar to you (the only customer)
¤ If I sell it to you, the sum of our gains is … ?
¤ If bargaining breaks down and I don't sell it, the sum of our gains is … ?
¤ So we have both cooperation--to get a deal--and conflict over the terms.
¤ Giving us the paradox that
á If I will not accept less than $.90, you should pay that, but …
á If you will not offer more than $.10, I should accept that.
¤ Bringing in the possibility of bluffs, commitment strategies, and the like.
¯ Consider a many player game
¤ We now add to all the above a new element
¤ Coalitions
¤ Even if the game is fixed sum for all of us put together
¤ It can be positive sum for a group of players
¤ At the cost of those outside the group
v Ways of representing a game
¯ Like a decision theory problem
¤ A sequence of choices, except that now some are made by player 1, some by player 2 (and perhaps 3, 4, …)
¤ May still be some random elements as well
¤ Can rapidly become unmanageably complicated, but …
¤ Useful for one purpose: Subgame Perfect Equilibrium
¤ Back to our basketball player--this time a two person game
¤ But … Tantrum/No Tantrum game
¤ So Subgame Perfect works only if commitment strategies are not available
¯ As a strategy matrix
¤ Works for all two player games
¤ A strategy is a complete description of what the player will do under any circumstances
¤ Think of it as a computer program to play the game
¤ Given two strategies, plug them both in, players sit back and watch.
¤ There may still be random factors, but …
¤ One can define the value of the game to each player as the average outcome for him.
¯ Dominant Solution: Prisoner's Dilemma as a matrix
¤ There is a dominant pair of strategies--confess/confess
á Meaning that whatever Player 1 does, Player 2 is better off confessing, and
á Whatever Player 2 does, Player 1 is better off confessing
á Even though both would be better off if neither confessed
|                  | Baxter: Confess | Baxter: Deny |
| Chester: Confess | 10, 10          | 0, 15        |
| Chester: Deny    | 15, 0           | 1, 1         |
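A minimal Python sketch of the dominance check, reading the matrix entries as (Chester's years, Baxter's years) in prison, so fewer is better; the 10, 10 cell is my reading of the confess/confess entry.

# Check for dominant strategies in the Prisoner's Dilemma above.
# Entries are (Chester's years, Baxter's years); fewer years is better.

years = {
    ("confess", "confess"): (10, 10),
    ("confess", "deny"):    (0, 15),
    ("deny",    "confess"): (15, 0),
    ("deny",    "deny"):    (1, 1),
}
moves = ("confess", "deny")

def is_dominant_for_chester(move):
    # Fewer years than the alternative no matter what Baxter does.
    other = "deny" if move == "confess" else "confess"
    return all(years[(move, b)][0] < years[(other, b)][0] for b in moves)

def is_dominant_for_baxter(move):
    other = "deny" if move == "confess" else "confess"
    return all(years[(c, move)][1] < years[(c, other)][1] for c in moves)

for m in moves:
    print("Chester:", m, "dominant?", is_dominant_for_chester(m))
    print("Baxter: ", m, "dominant?", is_dominant_for_baxter(m))
# Both confess, even though deny/deny (1, 1) would leave both better off.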
¤ How to get out of this?
á Enforceable contract
¬ I won't confess if you won't
¬ In that case, using nonlegal mechanisms to enforce
á Commitment strategy--you peach on me and when I get out …
¯ Von Neumann Solution
¤ Von Neumann proved that for any 2 player zero sum game
¤ There was a pair of strategies, one for player A, one for B,
¤ And a payoff P for A (-P for B)
¤ Such that if A played his strategy, he would (on average) get at least P whatever B did.
¤ And if B played his, A would get at most P whatever he did
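Von Neumann's theorem covers mixed strategies, but a short sketch can at least check whether a zero sum game has a pure-strategy saddle point (maximin = minimax). The matrix below is illustrative, not from the notes.

# Look for a pure-strategy saddle point in a two-player zero-sum game.
# Entries are payoffs to the row player A; B gets the negative.

payoff = [
    [3, 1, 4],
    [1, 0, 2],
    [5, 1, 6],
]

row_security = [min(row) for row in payoff]              # worst case of each A strategy
col_exposure = [max(row[j] for row in payoff)            # worst A can do against each B strategy
                for j in range(len(payoff[0]))]

maximin = max(row_security)   # A can guarantee at least this
minimax = min(col_exposure)   # B can hold A to at most this

print("A guarantees at least:", maximin)
print("B can hold A to at most:", minimax)
if maximin == minimax:
    print("Pure-strategy saddle point; value of the game =", maximin)
else:
    print("No pure saddle point; the value lies in mixed strategies.")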
¯ Nash Equilibrium
¤ Called that because it was invented by Cournot, in accordance with Stigler's Law
á Which holds that scientific laws are never named after their real inventors
á Puzzle: Who invented Stigler's Law?
¤ Consider a many player game.
á Each player chooses a strategy
á Given the choices of the other players, my strategy is best for me
á And similarly for everyone else
á Nash Equilibrium
¤ Driving on the right side of the road is a Nash Equilibrium
á If everyone else drives on the right, I would be wise to do the same
á Similarly if everyone else drives on the left
á Multiple equilibria
¤ One problem: It assumes no coordinated changes
á A crowd of prisoners are escaping from Death Row
á Faced by a guard with one bullet in his gun
á Guard will shoot the first one to charge him
á Standing still until they are captured is a Nash Equilibrium
¬ If everyone else does it, I had better do it too.
¬ Are there any others?
á But if I and my buddy jointly charge him, we are both better off.
¤ Second problem: Definition of Strategy is ambiguous. If you are really curious, see the game theory chapter in my webbed Price Theory
v Solution Concepts
¯ Subgame Perfect equilibrium--if it exists and no commitment is possible
¯ Strict dominance--"whatever he does …" Prisoner's Dilemma
¯ Von Neumann solution to 2 player game
¯ Nash Equilibrium
¯ And there are more
1/19/06
v A simple game theory problem as a lawyer might face it:
You represent the plaintiff, Robert Williams, in a personal injury case. Liability is fairly clear, but there is a big dispute over damages. Your occupational expert puts the plaintiff's expected future losses at $1,000,000, and the defendant's expert estimates the loss at only $500,000. (Pursuant to a pretrial order, each side filed preliminary expert reports last month and each party has taken the deposition of the opposing party's expert.) Your experience tells you that, in such a situation, the jury is likely to split the difference, awarding some figure near $750,000.
The deadline for submitting any further expert reports and final witness lists is rapidly approaching. You contemplate hiring an additional expert, at a cost of $50,000. You suspect that your additional expert will confirm your initial expert's conclusion. With two experts supporting your higher figure and only one supporting theirs, the jury's award will probably be much closer to $1,000,000 — say, it would be $900,000.
You suspect, however, that the defendant's lawyer is thinking along the same lines. (That is, they could find an additional expert, at a cost of about $50,000, who would confirm their initial expert's figure. If they have two experts and you have only one, the award will be much closer to $500,000 — say, it would be $600,000.)
If both sides hire and present their additional experts, in all likelihood their testimony will cancel out, leaving you with a likely jury award of about $750,000. What should you advise your client with regard to hiring an additional expert?
Any other ideas?
Set it up as a payoff matrix:
If neither hires an additional expert, plaintiff receives $750,000 and defendant pays $750,000?
If plaintiff hires an additional expert, plaintiff receives $850,000 and defendant pays $900,000
If defendant hires an additional expert, plaintiff receives $600,000 and defendant pays $650,000?
If both hire additional experts, plaintiff receives $700,000 and defendant pays $800,000?
|                         | Defendant: Doesn't hire | Defendant: Hires |
| Plaintiff: Doesn't Hire | 750, -750               | 600, -650        |
| Plaintiff: Hires        | 850, -900               | 700, -800        |
What does Plaintiff do?
What does Defendant do?
What is the outcome?
Can it be improved? How?
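A short Python sketch of the best-response logic for the matrix above (payoffs in $ thousands, the defendant's payment entered as a negative number so that each side prefers a higher number).

# Best responses in the expert-hiring game above.
payoffs = {
    ("no expert", "no expert"): (750, -750),
    ("no expert", "hire"):      (600, -650),
    ("hire",      "no expert"): (850, -900),
    ("hire",      "hire"):      (700, -800),
}
choices = ("no expert", "hire")

def plaintiff_best(defendant_choice):
    return max(choices, key=lambda p: payoffs[(p, defendant_choice)][0])

def defendant_best(plaintiff_choice):
    return max(choices, key=lambda d: payoffs[(plaintiff_choice, d)][1])

for d in choices:
    print(f"If defendant plays '{d}', plaintiff's best reply: {plaintiff_best(d)}")
for p in choices:
    print(f"If plaintiff plays '{p}', defendant's best reply: {defendant_best(p)}")
# Hiring is dominant for both, giving (700, -800) -- worse for both sides
# than (750, -750): the same structure as the prisoner's dilemma.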
v Game Theory: Summary
¯ The idea: Strategic behavior.
¤ Looks like decision theory, but fundamentally different
¤ Because even with complete information, it is unclear
á What the solution is or even
á What a solution means
¤ With decision theory, there is one person seeking one objective, so we can figure out how he can best achieve it.
¤ With game theory, there are two or more people
á seeking different objectives
á Often in conflict with each other
¤ A solution could be
á A description of how each person decides the best way to play for himself or
á A description of the outcome
¯ Solution concepts
¤ Subgame perfect equilibrium
á assumes no way of committing
á No coalition formation
¬ In the real world, A might pay B not to take what would otherwise be his ideal choice--
¬ because that will change what C does in a way that benefits A.
¬ One criminal bribing another to keep his mouth shut, for instance
á But it does provide a simple way of extending the decision theory approach
¬ To give an unambiguous answer
¬ In at least some situations
¬ Consider our basketball player problem
¤ Dominant strategy--better against everything. Might not exist in two senses
á If I know you are doing X, I do Y—and if you know I am doing Y, you do X. Nash equilibrium. Driving on the right. The outcome may not be unique, but it is stable.
á If I know you are doing X, I do Y—and if you know I am doing Y, you don't do X. Unstable. Scissors/paper/stone.
¤ Nash equilibrium
á By freezing all the other players while you decide, we reduce it to decision theory for each player--given what the rest are doing
á We then look for a collection of choices that are consistent with each other
¬ Meaning that each person is doing the best he can for himself
¬ Given what everyone else is doing
á This assumes away all coalitions
¬ it doesn't allow for two or more people simultaneously shifting their strategy in a way that benefits both
¬ Like my two escaping prisoners
á It also ignores the problem of how to get to that solution
¬ One could imagine a real world situation where
¯ A adjusts to B and C
¯ Which changes B's best strategy, so he adjusts
¯ Which changes C and A's best strategies …
¯ Forever …
¬ A lot of economics is like this--find the equilibrium, ignore the dynamics that get you there
¤ Von Neumann solution aka minimax aka saddlepoint aka …?
á It tells each player how to figure out what to do, and …
á Describes the outcome if each follows those instructions
á But it applies only to two person fixed sum games.
¤ Von Neumann solution to multi-player game (new)
á Outcome--how much each player ends up with
á Dominance: Outcome A dominates B if there is some group of players, all of whom do better under A (end up with more) and who, by working together, can get A for themselves
á A solution is a set of outcomes none of which dominates another, such that every outcome not in the solution is dominated by one in the solution
á Consider, to get some flavor of this, …
¤ Three player majority vote
á A dollar is to be divided among Ann, Bill and Charles by majority vote.
¬ Ann and Bill propose (.5,.5,0)--they split the dollar, leaving Charles with nothing
¬ Charles proposes (.6,0,.4). Ann and Charles both prefer it, so it beats the first proposal, but …
¬ Bill proposes (0,.5,.5), which beats that …
¬ And so around we go.
á One Von Neumann solution is the set: (.5,.5,0), (0,.5,.5), (.5,0,.5) (check)
á There are others--lots of others.
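A small Python sketch of the dominance definition applied to these proposals; "dominates" here means that some two-player majority all get strictly more under the new division and can vote it in for themselves.

# Dominance checking for the three-player divide-the-dollar game.
from itertools import combinations

def dominates(a, b):
    # Some pair of players (a majority) all do strictly better under a.
    return any(all(a[i] > b[i] for i in pair)
               for pair in combinations(range(3), 2))

proposals = [(.5, .5, 0), (.6, 0, .4), (0, .5, .5)]
for newer, older in [(1, 0), (2, 1)]:
    print(proposals[newer], "beats", proposals[older], "?",
          dominates(proposals[newer], proposals[older]))

# The proposed Von Neumann solution set: no member dominates another.
solution = [(.5, .5, 0), (0, .5, .5), (.5, 0, .5)]
print("internally undominated:",
      not any(dominates(a, b) for a in solution for b in solution if a != b))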
¤ Other approaches to many player games have been suggested, but this is enough to show two different elements of the problem
á Coalition formation, and …
á Indeterminacy, since one outcome can dominate another which dominates another which …
¤ Almost enough to make you appreciate Nash equilibrium, where nobody can talk to anybody so there is no coalition formation.
v Applied Schelling Points
¯ In a bargaining situation, people may end up with a solution because it is perceived as unique, hence better than continued (costly) bargaining
¤ We can go on forever as to whether I am entitled to 61% of the loot or 62%
¤ Whether to split 50/50 or keep bargaining is a simpler decision.
¯ But what solution is unique is a function of how people think about the problem
¤ The bank robbery was done by your family (you and your son) and mine (me and my wife and daughter)
¤ Is the Schelling point 50/50 between the families, or 20% to each person?
¤ Obviously the latter (obvious to me--not to you).
¯ It was only a two person job--but I was the one who bribed a clerk to get inside information
¤ Should we split the loot 50/50 or
¤ The profit 50/50--after paying me back for the bribe?
¯ In bargaining with a union, when everyone gets tired, the obvious suggestion is to "split the difference."
¤ But what the difference is depends on each party's previous offers
¤ Which gives each an incentive to make offers unrealistically favorable to itself.
¯ What is the strategic implication?
¤ If you are in a situation where the outcome is likely to be agreement on a Schelling point
¤ How might you improve the outcome for your side?
v Odds and Ends
¯ Prisoner's dilemma examples?
¤ Athletes taking steroids. Is it a PD?
¤ Countries engaging in an arms race
¤ Students studying in order to get better grades?
¯ Is repeated prisoner's dilemma a prisoner's dilemma?
¤ Suppose we are going to play the same game ten times in succession
¤ If you betray me in round 1, I can punish you by betraying in round 2
¤ It seems as though that provides a way of getting us to our jointly preferred outcome—neither confesses.
¤ But …
¯ Experimental games
¤ Computers work cheap
¤ So Axelrod set up a tournament
á Humans submit programs defining a strategy for many times repeated prisoner's dilemma
á Programs are randomly paired with each other to play (say) 100 times
á When it is over, which program wins?
¤ In the first experiment, the winner was "tit for tat"
á Cooperate in the first round
á If the other player betrays on any round, betray him the next round (punish), but cooperate thereafter if he does (forgive)
¤ In fancier versions, you have evolution
á Strategies that are more successful have more copies of themselves in the next round
á Which matters, since whether a strategy works depends in part on what everyone else is doing.
á Some more complicated strategies have succeeded in later versions of the tournament,
á but tit for tat does quite well
¤ His book is The Evolution of Cooperation
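A minimal Python sketch of one Axelrod-style pairing: tit for tat against always-defect and against itself in a repeated prisoner's dilemma. The per-round payoffs (3,3 mutual cooperation; 1,1 mutual defection; 5 and 0 when one side defects against a cooperator) are illustrative, not from the notes.

# Repeated prisoner's dilemma: tit for tat vs. always defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Cooperate first, then copy the opponent's last move.
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print("tit for tat vs always defect:", play(tit_for_tat, always_defect))
print("tit for tat vs tit for tat:  ", play(tit_for_tat, tit_for_tat))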
v Threats, bluffs, commitment strategies:
¯ A nuisance suit.
¤ Plaintiff's cost is $100,000, as is defendant's cost
¤ 1% chance that plaintiff wins and is awarded $5,000,000
¤ What happens?
¤ How might each side try to improve the outcome?
¯ Airline hijacking, with hostages
¤ The hijackers want to be flown to Cuba (say)
¤ Clearly that costs less than any serious risk of having the plane wrecked and/or passengers killed
¤ Should the airline give in?
¯ When is a commitment strategy believable?
¤ Suppose a criminal tries to commit to never plea bargaining?
¤ On the theory that that makes convicting him more costly than convicting other criminals
¤ So he will be let go, or not arrested
v Moral Hazard
¯ This is really economics, not game theory, but it's in the chapter
¯ I have a ten million dollar factory and am worried about fire
¤ If I can take a ten thousand dollar precaution that reduces the risk by 1% this year, I will—(.01 x $10,000,000 = $100,000 > $10,000)
¤ But if the precaution costs a million, I won't.
¯ Insure my factory for $9,000,000
¤ It is still worth taking a precaution that reduces the chance of fire by 1%
¤ But only if it costs less than …?
¯ Of course, the price of the insurance will take account of the fact that I can be expected to take fewer precautions:
¤ Before I was insured, the chance of the factory burning down was 5%
¤ So insurance should have cost me about $450,000/year, but …
¤ Insurance company knows that if insured I will be less careful
¤ Raising the probability to (say) 10%, and the price to $900,000
¯ There is a net loss here—precautions worth taking that are not getting taken, because I pay for them but the gain goes mostly to the insurance company.
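A quick Python sketch of that arithmetic: which precautions the owner takes with and without the $9,000,000 policy. The intermediate $50,000 precaution is a made-up example; the other numbers are from the notes.

# Moral hazard: is a precaution that cuts the fire risk by one
# percentage point worth taking, counting only the loss the owner bears?
factory_value = 10_000_000
coverage = 9_000_000
risk_reduction = 0.01

def worth_taking(precaution_cost, insured):
    exposure = factory_value - (coverage if insured else 0)
    return risk_reduction * exposure > precaution_cost

for cost in (9_000, 50_000, 150_000):   # $50,000 case is hypothetical
    print(f"precaution costing ${cost:>7,}:",
          "take it" if worth_taking(cost, insured=False) else "skip it",
          "(uninsured) /",
          "take it" if worth_taking(cost, insured=True) else "skip it",
          "(insured)")
# Uninsured, anything under $100,000 is worth taking; insured, only
# precautions under $10,000 are -- the rest of the gain goes to the insurer.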
¯ Possible solutions?
¤ Require precautions (signs in car repair shops—no customers allowed in, mandated sprinkler systems)
á The insurance company gives you a lower rate if you take the precautions
á Only works for observable precautions
¤ Make insurance only cover fires not due to your failure to take precautions (again, if observable)
¤ Coinsurance.
¯ Is moral hazard a bug or a feature?
¤ Big company, many factories, they insure
á Why? They shouldn't be risk averse
á Since they can spread the loss across their factories.
¤ Consider the employee running one factory without insurance
á He can spend nothing, have 3% chance of a fire
á Or spend $100,000, have 1%--and make $100,000 less/year for the company
á Which is it in his interest to do?
v Adverse Selection—also not really game theory
¯ The problem: The market for lemons
¤ Assumptions
á Used car in good condition worth $10,000 to buyer, $8,000 to seller
á Lemon worth $5,000, $4,000
á Half the cars are creampuffs, half lemons
¤ First try:
á Buyers figure average used car is worth $7,500 to them, $6,000 to seller, so offer something in between
á What happens?
¤ What is the final result?
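A Python sketch of the unraveling, assuming for simplicity that buyers offer the full average value of the cars still on the market.

# Lemons market: sellers whose car is worth more than the offer keep it.
buyer_value  = {"creampuff": 10_000, "lemon": 5_000}
seller_value = {"creampuff": 8_000,  "lemon": 4_000}

on_market = ["creampuff", "lemon"]        # equal numbers of each
while True:
    offer = sum(buyer_value[c] for c in on_market) / len(on_market)
    still_selling = [c for c in on_market if seller_value[c] <= offer]
    print(f"offer ${offer:,.0f} -> sellers remaining: {still_selling}")
    if still_selling == on_market:
        break
    on_market = still_selling
# First pass: the $7,500 average is below the $8,000 a creampuff owner
# wants, so only lemons remain and the price falls to the lemon value.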
¯ How might you avoid this problem—due to asymmetric information
¤ Make the information symmetric—inspect the car. Or …
¤ Transfer the risk to the party with the information—seller insures the car
¯ What problems does the latter solution raise?
v To think about:
¯ Genetic testing is making it increasingly possible to identify people at risk of various medical problems
¯ If you are probably going to get cancer, or have a heart attack, and the insurance company knows it, insurance will be very expensive, so …
¯ Some people propose that it be illegal for insurance companies to require testing.
¯ What problems would that proposal raise?
1/24/06
v Genetic Testing:
¯ A. No Testing
¯ B. Customers can test; insurance companies cannot condition rates on results
¯ C. Customers can test; insurance companies can condition rates
¯ What happens?
v Contracts
¯ Why they matter
¤ A large part of what lawyers do is drawing up and negotiating contracts
¤ In many different areas of the law
á Employment
á Partnerships
á Sales contracts
á Contracts between firms
á Prenuptial agreements and Divorce settlements …
¯ Why make a contract?
¤ Why deal with other people at all?
á Because there are gains to trade
á The same property may be worth more to buyer than seller
á Different people have different abilities
á Specialization and division of labor
á Complementary abilities
á Risk sharing
¬ An insurance contract not only transfers risk
¬ It reduces it--via the law of large numbers
á Does a bet due to different opinions count as gains from trade?
¤ A spot sale isn't much of a contract--why anything else?
á Because performance often takes place over time
á And the dimensions of performance are more complicated than "seventeen bushels of wheat."
á Even a spot contract might include details of quality--not immediately observable--and recourse.
¯ Two Objectives in negotiating a contract
¤ Maximize the size of the pie
¤ Get as much of it as possible for your client
¯ The second was covered in the previous chapter
¤ If there is some surplus from the exchange
á Meaning that you can both be better off with a contract
á Than without one
¤ Then you are in a bilateral monopoly bargaining game
á You are both better off if you agree to a contract
á But the terms will determine how much of the gain each of you gets
¤ Where commitment strategies or control of information might help
á But at the risk of causing bargaining breakdown
¬ Each of us is committed to getting at least 60% of the gain, or …
¬ I have persuaded you that what you are selling is only worth $10 to me, and it is worth $11 to you.
á And the pie goes into the trash
¯ This chapter is about the first--maximize the size of the pie
¤ Any time you see a way of increasing the size
¤ You can propose it, combined with a change in other terms--such as price
¤ That makes both parties better off
¤ This point is central to the chapter—if you are not convinced, we should discuss it now.
v Why incentives matter
¯ People often talk as if "more incentive" was unambiguously good
¤ Gordon Tullock's auto safety device
¤ There is such a thing as too much incentive
¤ What is the right incentive--for anything?
¯ Consider a fixed price contract to build a house
¤ Instead of spending $10,000 on roofing material that lasts 20 years
¤ The builder spends $5,000 on material that lasts 5 years
¤ After which the material must be replaced at a cost of $12,000
¯ What is the sense in which this is a bad thing?
¤ Compare to the case where the $5,000 material lasts 19 years.
¤ You want to set up the contract so he won't use the cheap material in the first case, but …
¤ What about the second?
¯ How about the incentive not to breach a contract?
¤ Should contracts ever be breached?
¤ When?
¤ How do you get that outcome?
¯ Enforceability and observability
¤ Consider the marriage contract
á Al-Tanukhi story
á Lots of dimensions of performance are unobservable by an outside party
á So a wife who wants a divorce …
á You might want to think about the general problem of marriage contracts
¬ Traditional: Divorce hard, gender roles largely specified by custom
¬ Current: Divorce on demand, terms freely negotiable day by day, mostly not enforceable
¬ Alternatives?
¬ What are the problems in designing a marriage contract?
¬ We will return to that question
¤ Ideally, the contract specifies terms that are observable
¤ Not always a sharp distinction
á Sometimes performance can be imperfectly observed--how well is this house built?
á And one might specify how to observe it--name the expert body whose standards you are agreeing to.
¤ A second enforceability problem--what if a party breaches and can't pay the damages?
¯ Reputation
¤ In today's discussion, we implicitly assume that the only constraint on both parties is the contract itself
¤ In many cases that's not realistic. One or both parties is a repeat player, and wants not only to stay out of court but to keep customers and get more.
¤ We will return to that question later, since it is relevant to how to structure contracts.
v Production Contracts—building a house.
¯ One party pays the cost, gets the house, the other builds it.
¯ Cost-plus or flat fee: Advantages and disadvantages
¤ Why is there a "plus" in cost plus?
á If one contractor will do the job for cost+$10,000, why won't another do it for cost+$9,000?
á Isn't the "plus" something for nothing? $9,000 is better than zero.
¤ Is it "plus" or "plus 10%?"
¤ Why?
¯ Incentive to get inputs at the lowest possible cost
¤ Flat fee: any savings goes to the contractor
á So he wants to minimize cost--including both price and his time and trouble
á Which is what you want him to do
á Why do you care about his time and trouble?
á What would happen if you set up the contract to force him to buy the input at the lowest possible price (holding quality fixed--same brand of windows, say)? Imagine he had to pay you a five thousand dollar penalty if you could show that, somewhere, it was possible to buy an input for less than he paid?
¤ Cost plus: savings on price goes to you
á But any increase in time and trouble needed to get the lower price he pays
á So he won't try very hard to find a lower price
á Even if it would save you more than it costs him to do so
¤ Cost plus 10%?
á Friedman's rule for finding the men's room
á And why it sometimes doesn't work
¤ If you are using cost plus, how might you control the problem?
¤ What are the problems you will face?
¯ Incentive to get inputs of the right quality
¤ Do we always want the highest quality inputs?
á Do you only eat at gourmet restaurants?
á And buy the highest quality car you can afford?
¤ Flat fee contract: Incentive of the builder is …
á To use the least expensive inputs, whatever their quality
á Because a dollar saved is a dollar earned--for him
¤ Cost plus contract, he doesn't care--extra quality comes out of your pocket
¤ Cost plus 10%?
¤ With a flat fee contract, how might you try to control the problem?
¤ What problems arise in doing so?
¯ Uncertainty:
¤ Renegotiating the contract
á Your client forgot something important--try to prevent that in advance
á Something important changed.
á You are stuck in a bilateral monopoly with the builder
¬ The bargaining range is bounded on one side by the terms of the initial contract--if he fulfills it he is in the clear
¬ And on the other side by the most you are willing to pay for the change
¬ Which might be expensive
á You could include terms for changes in the contract
¬ Will that be easier with flat fee, cost plus, cost plus 10%?
¬ Think about it from the builder's standpoint.
¤ Risk bearing
á What if something changes that greatly increases the cost?
¬ Under flat fee, the builder swallows the loss
¬ Under cost plus, you do
á What if something changes that greatly lowers the value to you?
¬ You contract to have land cleared and a new factory built
¬ In 1929
¬ Risk allocation depends on the contractual terms for breach
¬ Or on negotiation--again, with a potential holdout problem
á Why does risk bearing affect the size of the pie?
¬ Because different parties have different abilities to bear risk
¬ Because poor contract terms or bargaining breakdown might lead to a smaller pie--the land gets cleared, the factory built, and it sits empty until 1942.
Electronic Equipment Service Contract
Global Consolidated Industries (GCI) has for years had an in-house electronic equipment maintenance department. It has been responsible for providing maintenance (such as periodic cleaning and lubrication of moving parts) and repair (fixing machines when they break down) on thousands of printers, photocopy machines, FAX machines, scanners, and so forth. The experience, in a word, has been a disaster. On most days, secretaries can be seen running from floor to floor and pushing in line to use other machines when theirs are inoperative. Even the CEO is often heard screaming about memos being late, meetings having to be rescheduled, and other headaches caused by out-of-order equipment.
GCI has decided that it is time to contract out for these services. As a member of GCI's general counsel's office, you have been called in to participate in the contract negotiations with the outside service provider, Reliable Response Repair (RRR).
RRR has offered two contracts for your consideration. Under one contract, RRR receives a flat rate per machine each contract year. (For example, there is a $200 per year charge for a standard, mid-size photocopy machine.) Under this arrangement, RRR is obligated to provide all necessary maintenance and to repair broken-down machines promptly.
Under the second contract, RRR is paid $75 per hour (plus parts) for all maintenance and repair services. Under this arrangement as well, RRR is obligated to provide all necessary maintenance and to repair broken-down machines promptly.
Explain the pros and cons of each of the two contracts. Which seems best? Can you think of additional terms that would improve it?
v What is RRR's incentive to do a good job of maintaining and fixing the machines under either contract?
v To do it promptly?
v What are GCI's incentives under each contract? Why might RRR care about that?
v Flat rate:
¯ RRR incentives
¤ Incentive to maintain if it is cheaper than fixing
¤ Incentive to do a good job of fixing, since if not they have to come back
¤ Promptness? Only to the extent you can enforce that term
á So you may want to define it more precisely
á Must show up within 2 hours, fix within 4, or …
á Penalty based on how many hours machines are down each year, or …
á Bonus for less than 6 hours down time per machine
¤ Risk?
á Very little risk to GCI—they know how much they will pay
á All of the risk is on RRR—what if a machine has problems and keeps giving trouble?
á But GCI is big enough so that such effects should average out
¯ GCI incentives:
¤ Why do you worry about those?
¤ GCI has little incentive to take good care of machines, train people well, control whatever inputs they provide that affect the chance of breakdown
¤ Little incentive to hold down RRR's cost by, say, not using machines heavily at two in the morning, or only asking for a technician to be sent when the problem is serious
¤ GCI has reduced incentive to buy good quality machines
á So the contract might specify machines presently on site, which RRR can inspect in advance
á Or specify what brands and models of new purchases are covered
v Per hour:
¯ RRR incentives
¤ If per hour is more than their real cost, a serious problem
á Why maintain when you get paid to fix?
á Why fix well when you get paid to come back?
¤ If per hour is at their real cost, still have to monitor to make sure they are really working that many hours
¤ Promptness still a problem as above.
¯ GCI Incentives
¤ GCI now has an incentive to buy good machines
¤ To take good care of the machines
¤ Only to call a tech when really needed
¤ And RRR might charge more at 2 A.M. (modification of terms)
¯ Question: Does GCI have to use RRR under this contract?
¤ If not, they can use competition or the threat of it to control some of these problems, but …
¤ A problem if RRR is hiring extra maintenance personnel specifically to deal with GCI repairs.
v What if quality of repair affects machine lifetime?
¯ Either way, RRR has little incentive to do a good job in that dimension
¯ Perhaps GCI should lease the machines from RRR, with repairs and maintenance included in the terms.
v Perhaps what we want is some of the cost on each party
¯ Per hour payment low enough to give RRR an incentive to maintain machines, fix them right, but …
¯ High enough to give GCI an incentive to do what it easily can to avoid breakdowns.
¯ The same principle as coinsurance.
¤ Neither party bears the full cost, so neither has as much incentive to prevent the problem as we would like, but …
¤ Each bears enough of the cost to make it in its interest to take most of the precautions that ought to be taken.
Musician and Nightclub Booking Arrangement
Your client, Jerry the Jazz musician, is becoming increasingly well-known in the region. He has recently been offered a booking arrangement by the Nightowl nightclub, the ritziest jazz bar in the city, for Tuesday nights. They propose paying him $500 per appearance plus 10% of house profits. Because they want to have the opportunity to use other musicians for variety, taking advantage of out-of-town players who pass through, they are only willing to guarantee Jerry 26 Tuesday night appearances over the course of the year. They would give him one week's notice with regard to each Tuesday, and he would be obligated to appear when called.
Jerry tells you that he finds this offer attractive because it would give him some stability in his income, something he has never had before. On the other hand, he does not like the idea that the arrangement would preclude his doing any other gigs on a Tuesday night (or out-of-town gigs on Mondays or Wednesdays); given his increasing reputation, he occasionally gets great one-shot offers.
How do you advise Jerry regarding his contract negotiations with Nightowl?
v Gains from trade
¯ Jerry and Nightowl both reduce uncertainty
¯ Appearing regularly at Nightowl probably benefits both
v Problems that might be fixable:
¯ Jerry wants flexibility for out of town gigs
¯ How to reduce the cost of that to Nightowl?
¤ If he gives them a month advance notice, might be able to fill in
á They only plan to use him half the Tuesdays
á Still some cost--there might not be anybody good in town
á But perhaps less than the benefit to Jerry
¤ What if he can get off if he finds a substitute?
á How do we define an adequate substitute?
á Someone they have hired before?
á Someone from a pre-agreed list?
¤ What if he agrees to play a different day when he isn't there Tuesday?
¤ What if he can take off a fixed number of Tuesdays by one month advance notice?
á Hypothetical numbers
á Jerry wants the right to block out 5 Tuesdays, a month in advance
á Nightowl thinks it costs them $400/Tuesday
¬ Hassle of finding a replacement
¬ Risk of lower quality
¬ Disappointment of Tuesday customers who are fans of Jerry's.
á Jerry offers to accept $400 instead of $500 per appearance in exchange
á Saves Nightowl $100 x 26 Tuesdays = $2,600, so they are better off
á Jerry can make $1,000 more on each out of town gig, so he gains $5,000, loses $2,600, and is better off too
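A quick check of those hypothetical numbers in Python (the $1,000 extra per out-of-town gig is the figure assumed above).

# Does the block-out-five-Tuesdays trade leave both sides better off?
appearances = 26
fee_cut = 500 - 400                  # Jerry gives up $100 per appearance
blocked_tuesdays = 5
cost_to_nightowl_per_tuesday = 400
extra_per_outside_gig = 1_000

nightowl_gain = fee_cut * appearances - cost_to_nightowl_per_tuesday * blocked_tuesdays
jerry_gain = extra_per_outside_gig * blocked_tuesdays - fee_cut * appearances

print(f"Nightowl is better off by ${nightowl_gain:,}")   # 2,600 - 2,000 = 600
print(f"Jerry is better off by    ${jerry_gain:,}")      # 5,000 - 2,600 = 2,400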
v Other issues
¯ If Jerry has the option, he might choose big nights--New Year's Eve--since he is getting the same fee for every night from Nightowl, could get more elsewhere.
¤ How might we solve that?
¤ Pay him more for specified big nights?
¤ Or specified big nights he doesn't have the option of taking off?
¤ Or, when he notifies them, they bargain with him?
¯ Breach: Under the initial contract, what if he accepts a Tuesday gig and then Nightowl wants him that Tuesday?
¤ Liquidated Damages? What does the contract say?
¤ What if he is sick and can't play?
¤ Can Nightowl tell the difference? Depends how far away the gig is?
¤ Breach terms another way of getting flexibility
á Liquidated damages of $300 if he backs out with a month's notice
á $500 with a week's notice
á $2000 if he just doesn't show up
á How should we set the damages?
á How about calling in sick?
¤ Negotiation another way of getting flexibility
á Gets an invitation for an out of town gig
á Asks Nightowl if they need him that night
á If they do, starts bargaining
á Asymmetric information? How is it in Nightowl's interest to act?
á Can Jerry tell?
v Incentive issues
¯ For Jerry: What are his incentives
¤ To do a good job?
¤ To come when he says he will?
¯ What are Nightowl's incentives?
¤ To advertise Jerry
¤ To run a good club (why does he care?)
¤ To use him often?
á Should there be different terms for other nights?
á He isn't committed--but doesn't get paid as much?
á His time is probably worth much less than $500 if he doesn't have a gig
v Verifiability:
¯ Jerry gets 10% of profits--how measured?
¤ You are an unscrupulous Nightowl owner--how do you hold down what you pay Jerry?
¤ Can he tell?
¯ Are there other ways of rewarding him related to how good a job he does?
¤ More easily observed? Revenue--but also a bit tricky
¤ More closely targeted on his contribution?
v General issues here are:
¯ Enlarging the pie
¯ Via incentives
¯ Risk bearing?
¯ Verifiability of terms
State AG Litigation Contract
You are a lawyer in the consumer protection division of the state attorney general's office. Preliminary investigations as well as some undercover stories in the press reveal the possibility of a major billing scandal involving the health care industry. Following the growing number of states who have recently pursued such claims and the recent huge success in tobacco litigation, it is proposed to bring suit against a number of firms. The total damages claim is for hundreds of millions of dollars, possibly more than a billion.
Your office, however, has only four attorneys, many of whom are quite busy on other matters. Therefore, it is agreed to hire an outside firm that specializes in large-scale litigation, probably one of those super-successful plaintiffs' boutique firms. Many of them have already expressed interest and some have been interviewed.
Two further notes. First, although this novel litigation strategy has the potential to be extremely lucrative, it will also be expensive, requiring that millions of dollars worth of lawyers' and experts' time be invested up front. Second, the office is worried about the possible political fallout of making fee payments to outside lawyers that prove embarrassingly large.
Advise your department head on the compensation scheme that should be used in the contract with the outside firm. Focus on the form of the compensation scheme and any closely related matters. In preparing your advice, be sure that you do each of the following:
Describe different ways that the firm could be compensated.
Identify the major pros and cons of each approach.
Discuss how, if at all, any negatives of a given approach may be mitigated.
| Compensation Scheme | Incentives | Risk | Political, Other |
| Flat Fee            |            |      |                  |
| Hourly              |            |      |                  |
| Contingent Fee      |            |      |                  |
v Flat Fee
¯ Incentives
¤ No financial incentive for lawyers to win
¤ Possible reputational incentive
¤ How well can a small AG's office monitor the lawyers?
¤ Can you control how hard they try by contract?
¯ Risk
¤ None on payment for law firm
¤ But they bear all the risk of costs
¤ Who is more risk averse?
¯ Political
¤ No risk of stories on huge fee payment, but …
¤ If the case fails, agency looks bad--money for nothing
v Cost-Plus (hourly)
¯ Incentives
¤ To spend too much time if rate is higher than real cost of time to firm
á Too little if rate is lower, but …
á Less of a problem than the previous case, where hourly rate is zero.
¤ Can you verify
á Hours actually worked
á Quality of work. Who do they assign, how hard does he try?
¤ Can you control by contract?
¯ Risk
¤ All of the revenue risk is borne by the state
¤ And most of the cost risk
¯ Political
¤ No risk of huge payments for no work, but …
¤ Risk of huge payments for no return
v Contingent Fee
¯ Incentives
¤ Firm wants to win.
¤ How large a fractional payout?
á Higher percentage, better incentives, but …
á Less left for the state
á What about 100% and negative fixed fee?
¤ At anything less than 100%, incentive still imperfect. Assume 50%.
á If it costs the firm $1000 to increase expected return by $1500, they won't do it.
á So still want some oversight
á And hope reputation helps.
¤ No incentive for the firm to get relief other than a damage payment
¯ Risk
¤ Is being shared between firm and state
¯ Political
¤ No risk of large payment for no result
¤ But very large amounts to lawyers if the suit is successful might be embarrassing
v What is the maximand?
¯ Suppose the defendant is actually innocent
¯ The law firm still wants to win
¯ Does the state?
v School Gymnasium: Applying what we have learned.
¯ Flat fee or Cost plus?
¤ The school probably doesn't know enough to monitor a cost-plus contract
¤ And is probably in a poor position to bear risk
¤ So flat fee is probably better, but …
¯ Problems with flat fee
¤ Maintaining quality
á Have to specify a lot of details
á School doesn't have the expertise to do that, but …
á Their architect might.
á Hire some sort of expert to write the specs
¤ Making changes
á Question your client carefully to keep later changes from being necessary
á Perhaps include in the contract that changes can be made on a cost plus basis
á Or plan on negotiating changes.
v Arguments in litigation
¯ The book sketches the law and econ argument for enforcing the quality terms in a flat fee contract
¤ Because otherwise the builder has an incentive to degrade quality
¤ Even when doing so costs you more than it saves him.
¯ Do you think a judge would find that more or less convincing
¯ Than the "good faith" sort of argument?
v Principal/Agent Contracts
¯ Lots of varieties, including
¤ Construction contracts we have been discussing
¤ Employment contracts
¤ Lawyer/client contracts--you are the agent.
¤ Is the President the voters' agent?
¤ …
¯ Possible forms
¤ Pay by performance--did you sell a car? Win a case?
¤ Pay for inputs--how many billable hours?
¤ Fixed-fee
¤ Combinations.
á Employees frequently get a fixed salary, plus …
á Bonus for specified accomplishments, by them or their unit or the firm, or …
á Optional bonus--Google example.
á Your raise next year is to some extent a "by performance" for this year
¯ Incentives: How to make it in the interest of the agent to do what the principal wants
¤ What does the principal want?
á "To win her lawsuit?"
á At any cost?
¤ Performance based contracts give the agent an incentive
á To achieve the objective
á If the reward for doing so is greater than the cost of doing so
á Suppose the reward is 10% of the value of success
á Will the agent act as the principal would like?
á What about 200%?
á If all we are concerned about is the right incentive, the reward should be …?
á What are the problems with this solution?
¬ It might pay the agent too much.
¬ Consider a store whose profit depends on ten different employees.
¬ How would we solve that problem?
¬ The solution might impose too much risk on the agents.
á So there are costs to the rule that gives the right incentive.
á A further problem is measuring output
¬ Consider the President of a publicly traded company
¬ Perhaps profits are low this year because of high research costs which will bear fruit in five or ten years
¬ Or because of problems facing the industry for which he is not responsible.
¬ Consider a secretary or janitor or … How do you measure output?
á One reason to decentralize firms is to make this problem a little easier to solve
¬ We can judge the output of the Buick division of GM better if it is run like a separate company
¬ Of one partner in a law firm if we can keep track of his accounts
¯ Input based contract
¤ For instance, paying an hourly wage
¤ Or billable hours
¤ Gives the agent an incentive on the measurable dimension of input
¤ But not on other dimensions--how hard he works, for instance.
¯ Fixed fee contract
¤ No automatic incentive to do anything
¤ Make the fixed fee for some measurable result (show up in court, etc.)
¤ Or have some way of defining what inputs the fixed fee is buying, and monitoring them.
¤ May rely heavily on reputation.
v Risk bearing
¯ Performance based, risk borne largely by the agent
¯ Input based, principal bears risk of outcome, risk of wanting more inputs.
¯ Fixed fee, principal bears risk of outcome, agent risk of costs.
v Coffee house manager employment contract
¯ Performance based
¤ Do we have to base it on the profits of the whole firm?
¤ Or is there a better solution?
¤ What about compromises to reduce the risk the manager bears?
¯ Input based
¤ Performance depends on manager's inputs, but …
¤ Much of it is qualitative, hard to measure, harder to prove to a court in case of dispute
¤ And the quantitative--hours put in--requires someone monitoring the manager
á Which means someone working in his coffee house
á And so partly dependent on him for promotion etc.
¯ Fixed fee--flat salary
¤ Requires monitoring of inputs and performance
¤ If unsatisfactory, replace the manager
v Joint undertakings
¯ Include
¤ Partnership--such as a law firm
¤ Joint project by two firms--Apple and IBM, say
á IBM develops a new chip (G5, 60 nm)
¬ Apple makes plans and promises based on it
¬ And Steve Jobs eats crow when he still doesn't have his 3 GHz desktop.
á How might a contract deal with this (don't know if it did)
¬ IBM controls how hard they try
¬ And has more information on what they can do, risks (not enough information, as it turned out—everyone had more trouble with 60 nm than expected)
¬ So should IBM be liable for Apple's losses?
¬ But Apple is the one deciding what promises Steve makes, other decisions affecting amount of loss.
¤ …
¯ Incentives
¤ Horizontal division—between partners, allocating income by business brought in, billable hours, …
¤ Functional division—Apple and IBM above.
¯ Risk sharing
¤ May modify "reward by output" within firm
¤ Partly output, partly input, partly fixed
¯ What is observable?
¤ Did IBM make best efforts to develop?
¤ Could Intel be used as benchmark?
¤ Did Apple act to minimize loss due to failure of IBM to deliver?
v Sale or lease of property
¯ Quality dimension
¤ Of property as delivered
¤ And as returned
¤ Inspect?
¤ Contractual restrictions on use, subletting, …
¤ Security deposit
á Saves court costs if property damaged,
á Solves judgement proof problem, but …
á How do you keep landlord from confiscating it if not damaged?
á Raises the general issue of structuring a contract wrt what happens if nobody goes to court. Will return to that Thursday
¤ Damage in delivery
á Make the party who has possession liable? Can best control
á Or the party who chooses the third party to deliver
¯ Information
¤ What are you obliged to tell?
¤ Treaty of Paris, War of 1812, case.
¤ Poltergeist case
¯ Who bears the risk of the rented building burning down?
¤ Incentive—tenant
¤ Risk spreading? Probably landlord.
v Loan
¯ Risk of bankruptcy,
¤ deliberate or otherwise.
¤ "deliberate" might include taking risks—heads I win, tails you lose.
¤ Control by
á Security interest in property—borrower can't sell it
á Controls on what borrower can do.
v Resolving disputes
¯ Some can be avoided by anticipation, but …
¤ There isn't enough small print in the world to cover everything
¤ And events may occur that you hadn't thought of.
¯ Damages for breach
¤ Expectation damages lead to efficient breach, inefficient reliance
¤ Liquidated damages solve the problem—if damages can be estimated in advance.
v Negotiating the contract
¯ Try to maximize the pie
¤ By offering to buy improvements that help your side at a cost to the other
¤ To sell improvements that help them at a cost to you
¤ To trade
¯ Try to maximize your share—typically in the price
¤ While remembering that if you ask for too much
¤ You risk bargaining breakdown
¤ And getting nothing
v China to Cyberspace: Contracts without court enforcement
¯ An issue for
¤ You—because part of an attorney's job is staying out of court
á Which you do in part by designing contracts
á Which it isn't in either party's interest to try to get out of
á Look at how many contracts amount to the consumer signing away as many of his potential claims as possible
¬ One explanation is that it is that way to benefit the seller at the buyer's expense
¬ That seems inconsistent with our analysis—any expense to the buyer will reduce what he is willing to pay for the product
¬ Why might this arrangement be in the interest of both? (stay tuned)
¤ Imperial China—because the legal system was almost entirely penal
á You could complain you had been swindled, ask the district magistrate to act
á But you couldn't actually sue and control the case
á And the legal system said almost nothing about contract law
¤ Cyberspace, because
á Hard to use the legal system when dealings routinely cross jurisdictions
á The technology makes it possible to combine anonymity and reputation
¬ Public key encryption as a way of maintaining anonymity
¬ And digital signatures as a way of proving identity
¯ Either your realspace identity, or …
¯ Your cyberspace identity
¯ I.e. that you are the online persona with a particular reputation.
¯ My legal eagle business plan
á For quite a lot of people, anonymity might be a plus
¬ Lets you opt out of the state legal system—which contracts often try to do.
¬ Protects you in places where security of property is low
¯ Do you want to be a programmer known to be making $50,000/year
¯ In China, or Burma, or Indonesia, or …
¯ You might be worried about either private seizures—kidnapping your kids, say
¯ Or public ones.
¬ Might let you evade taxes or regulations at home.
¯ One way of enforcing contracts without the courts is reputation
¤ Reputational enforcement depends on your being a repeat player, so your reputation matters to you.
¤ It also depends on interested third parties knowing whether you cheated someone
á Since your "punishment" isn't designed to punish you
á But to keep other people from letting you cheat them
¤ If it is hard to know which party to a dispute is telling the truth
á Interested third parties will distrust both—either might be lying
á So it isn't in your interest, when cheated, to complain
á So reputational enforcement doesn't work
¤ Arbitration is a way of lowering the information cost to third parties
á If we went to a respected arbitrator, or one we agreed on in advance
á And he ruled in my favor, and you didn't go along
á You are probably the bad guy
¯ Another way is structuring the contract so that it is never in either party's interest to breach
¤ I hire you to build a house on my property
á If I pay you at the beginning, it is in your interest to take the money and run, if you can get away with it.
á If I pay you at the end, it is in my interest to keep the house and not pay
á So I pay you in installments during the construction
á Arranged so there is no point at which either of us gets a large benefit from breach
á Sometimes doing this requires costly changes in the pattern of performance
¬ Lloyd Cohen's explanation of the consequences of no fault divorce
¯ In the traditional marriage, women performed early, men late
¯ Many men find younger women more attractive, so …
¯ Incentive for a husband at forty, with the kids in school and his wife finally getting a chance to rest
¯ To dump her for a younger replacement
¬ How did women change their behavior to control the problem?
¯ Postpone childbearing in order to bring performance more nearly in sync
¯ Shift household production to the market and get a job
¤ Which both gets performance in sync, and
¤ Reduces the degree to which the wife is specialized to being the wife of that man
¤ And so at risk if he breaches.
¤ Since there are gains from completing the contract, in a world of certainty we ought to be able to structure payment and performance to achieve this, but …
á In an uncertain world, where costs and benefits may change, it's hard
á We can always reduce my incentive to breach by my giving you a deposit at the beginning, which you hold and will keep if things break down
á But that increases your incentive to breach
¤ One solution is to use a hostage instead of a deposit
á I give you something—my son, my trade secret—that
¬ it costs me a lot to lose
¬ but benefits you only a little to keep
¬ so pushes down my benefit from breach a lot, yours up a little
¤ Another solution is to structure payments so that the incentive to breach is on the party who has reputational reasons not to
á You are going to do some work for me online—write a program, say
á If you are a repeat player with reputation, I pay in advance
á If I am, I pay for the program when it is delivered
á Arguably, this explains the feature of real contracts discussed above
¬ It is in the interest of both parties to avoid expensive litigation
¬ The seller is a repeat player with a reputation, the buyer is not
¬ So substitute reputational enforcement for court enforcement
¬ Which would you prefer
¯ To buy a product with a long warranty from Apple or Kitchen Aid—in a world where the warranty wasn't enforceable
¯ Or from a no-name seller, in a world where you could sue the seller for not carrying out the warranty?
v Other ways of staying out of court
¯ So far as possible, arrange the contract so that the result you want is the one that happens with no court intervention
¯ Caveat emptor is an example
v General observations
¯ A bunch of simplifications
¤ cost rather than current value
¤ Assets:
á must be linked to some past transaction or event
á yield probable future benefits
á be obtained or controlled by the entity
¤ assets don't include good will, corporate culture, …
¤ all probabilities are one or zero
¤ why?
¯ Compare to tort law
¤ All probabilities are one or zero
á Someone sues you for ten million dollars
á If probability of guilt is .4, you owe nothing
á If .6, you owe ten million
¤ Damages tend to be limited to
á pecuniary, medical costs, lost earnings
á less willing to include pain and suffering and the like
¯ In both cases, we have to make decisions with a very crude process
¤ making legal outcomes depend on things in complicated ways is likely to raise litigation costs, legal uncertainty. Easier to prove a doctor's bill than a pain.
¤ Accounting aims at sufficiently clear cut decision rules
á So that firms can't easily manipulate the outcome
á To make them look good
á Or reduce their taxes.
á At a considerable cost in accuracy
v
Understanding
accounting
¯
First
rule—ignore Òdebit/creditÓ or reverse their meaning
¤
Most of the
time, a debit makes a firm richer
¤
A credit
poorer
¤
One
explanation: "Debit" is from Italian Debitare—what others owe
you
¤
And what
about credit?
¯
Second
rule—()= -
¯
making
sense of a balance sheet
¤
photograph
of the firm at an instant—compare two dates
¤
show a list
of assets, most liquid at the top, at two periods
á
group into
current assets, total
á
and long
term ("property, plant and equipment") and total
á
total the
two totals for total assets
¤
similar
list of liability and owner's equity
á
liability a
negative asset
¬
probable
sacrifice of economic benefit É
á
why do you
put equity with liabilities?
á
How much
wealth does the firm itself (as opposed to stockholders and others) have?
á
the
fundamental equation
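A minimal Python sketch of the fundamental equation, with hypothetical numbers of my own: equity is whatever is left of the assets after the liabilities are subtracted, so the two sides of the balance sheet must always total to the same figure.

    # Hypothetical balance sheet, only to illustrate Assets = Liabilities + Equity
    assets = {"cash": 2000, "property_plant_and_equipment": 3000}
    liabilities = {"bank_loan": 4000}

    total_assets = sum(assets.values())
    total_liabilities = sum(liabilities.values())
    equity = total_assets - total_liabilities      # the owners' residual claim

    # The fundamental equation holds by construction
    assert total_assets == total_liabilities + equity
    print(total_assets, total_liabilities, equity)  # 5000 4000 1000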
¯
making
sense of an income statement
¤
designed to
show the changes over a period of time
¤
money coming
in: Sales revenue (or equivalent for other sorts of firms)
¤
costs
á
cost of
goods sold—raw material, labor, etc.
á
operating
expenses: Costs not attributable to particular output
á
interest
expense
á
income tax
expense
¤
at each
stage, you have a net to that point
¤
and end up
with net income
¯
making
sense of a cash flow statement
¤
the one in
the book
á
money comes
in as net income, but É
á
if part of
the "income" is accrued but not received É
¬
it goes
into accounts receivable, not cash,
¬
so less
cash
¬
reverse if
some accounts from last year are paid, increasing cash
¬
so subtract
from income the increase in accounts receivable
á
accounts
payable the same thing in the other direction
¬
we
subtracted out expenses in calculating income, but É
¬
if some
expenses were accrued but not paid É
¬
we still
have the cash
á
we subtracted out depreciation in calculating net income—but it didn't use up any cash. Add it back in.
á
also cash
flows from
¬
borrowing
(increases cash)
¬
paying
dividends (uses up cash)
¬
etc.
¯
making
sense of T-accounts
¤
The T-account
records a single transaction, not a balance or a total over time
¤
each
transaction is entered twice
¤
if you buy
something
á
that
decreases cash, increases asset (land, factory, raw materials)
á
if you sell
something, increases cash, decreases inventory
¤
what if you
make money?
á
Buy
something for $100
á
Sell it for
$200
á
How do you
make the accounts balance?
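A minimal Python sketch of the "enter everything twice" idea, with made-up numbers. The point of the last question above is that the extra $100 has to show up somewhere: it goes to income, which is closed into equity when the books are closed.

    # Every transaction changes assets and claims (liabilities + income/equity) by equal amounts
    assets = {"cash": 0, "inventory": 0}
    claims = {"income": 0}   # income will be transferred to equity when the books are closed

    def post(asset_changes, claim_changes=None):
        claim_changes = claim_changes or {}
        assert sum(asset_changes.values()) == sum(claim_changes.values()), "entry does not balance"
        for name, change in asset_changes.items():
            assets[name] += change
        for name, change in claim_changes.items():
            claims[name] += change

    # Buy something for $100: an asset-for-asset swap, nothing on the claims side
    post({"cash": -100, "inventory": +100})

    # Sell it for $200: cash up 200, inventory down 100; the $100 gain is income
    post({"cash": +200, "inventory": -100}, {"income": +100})

    print(assets, claims)   # {'cash': 100, 'inventory': 0} {'income': 100}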
¯
Joyce James
Case
Joyce James graduated from college in June 2002. As was traditional in the James family, Joyce's parents paid all of her expenses through college. But, upon graduation, she was expected to fend for herself financially. On the date of her graduation, Joyce had neither financial resources nor financial obligations. Now that she is responsible for her own finances, one of her friends has suggested that she might want to think about putting together a financial statement of some sort. What sort of financial statement do you think would be useful for Joyce? How would you propose she account for the following transactions?
1. At her graduation exercises, Joyce was awarded a prize of $5,000 for her senior thesis on Day Hiking in Ireland. The prize came in the form of five one-thousand-dollar bills.
2. She spent $2,000 of the prize money buying books she would need for graduate school, which she was planning to attend in September.
3.
She spent another $2,000 traveling through Europe over the summer and
collecting memories of a lifetime.
4.
At the end of the summer, she took out a $4,000 loan to cover the costs of
graduate school.
v
Why might she
want to work out a financial statement?
¯
To keep track of
her situation—decide if she is too much in debt, etc.
¯
For other
people—to get a loan, É
v
What kind of
information is most useful to her?
¯
Probably a
balance sheet, showing her assets and liabilities
¯
But to get there
we will use T-accounts—more for our information than hers.
v
How do we record
the prize?
¯
She gets $5000 in
cash—where does that go?
¯
What's the
balancing item?
¯
Is there a
reduction in some other asset?
¯
An increase in a
liability?
¯
If not, what's left?
v
She spends $2000 on books for grad school
¯
Where does
the expenditure go?
¯
What's the
balancing item?
v
She spends
$2000 traveling in Europe and collecting memories?
¯
Where does
the expenditure go?
¯
Are the
memories an asset?
¯
If not,
what balances the expenditure?
v
She takes out a $4000 loan to cover the costs of graduate school
¯
Where does
the loan go?
¯
What
balances it?
v
She spends
$5000 on living expenses in graduate school
¯
Where does
the expenditure go?
¯
What
balances it?
v
Now put
this all together for a balance sheet
¯
What are
her assets?
¯
What are
her liabilities?
¯
What is her
equity?
v
Is this an
accurate account of her actual situation?
¯
For what
purpose?
¯
Are you
thinking about loaning her money?
¯
Or marrying
her?
v
The
matching principle
¯
So far as
possible, we want to put revenue and the associated costs in the same period
¯
So that we
can see how what we are doing is affecting us
¯
So we try
to recognize income when it is earned, not when we get it
¤
By showing
it as an increase in accounts receivable
¤
If it isn't
actually going to get paid until the next period
¯
And defer
costs to when they will generate income
¤
Depreciation
is an attempt to do that
¤
Your
computer isn't a cost for this year but a cost spread over several years
¤
What
happens if you buy identical inputs at different prices?
á
What value
to use to measure them when you sell them (or use them)?
á
FIFO or
LIFO?
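A small Python sketch of the FIFO/LIFO question, with hypothetical purchase prices of my own. The same two physical units go out the door either way; the only difference is which recorded cost goes to cost of goods sold and which stays in inventory.

    from collections import deque

    purchases = [100, 110, 120, 130]   # four identical units bought at rising prices
    units_sold = 2

    fifo = deque(purchases)            # first in, first out: charge the oldest costs first
    fifo_cogs = sum(fifo.popleft() for _ in range(units_sold))

    lifo = list(purchases)             # last in, first out: charge the newest costs first
    lifo_cogs = sum(lifo.pop() for _ in range(units_sold))

    print("FIFO:", fifo_cogs, "cost of goods sold,", sum(fifo), "left in inventory")  # 210, 250
    print("LIFO:", lifo_cogs, "cost of goods sold,", sum(lifo), "left in inventory")  # 250, 210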
v
The
lawyer's perspective—why does all this matter to you?
¯
Contracts
may specify things in terms of accounting entities
¯
The
decision to make a loan may depend on accounting figures on the borrower
¯
Firms have
legal obligations with regard to accounting, especially publicly traded firms,
and you may have to tell them if they are fulfilling them.
¯
Others?
Accounting
for Lawyers:
Upstage
Theater Company Handout
The Upstage Theater Company (UTC) is a non-profit community
theater group that puts on
several plays each year. On December 31, 2001, the Company had the following balance
sheet.
Assets
  Cash                      $2000
  Costumes and Sets         $3000
  Total Assets              $5000

Liabilities and Surplus
  Bank Loan                 $4000
  Total Liabilities         $4000
  Surplus                   $1000
In the course of 2002, the following events occurred. The company would like your advice on how to account for these transactions.
1. At
the beginning of the year, an anonymous donor makes an unrestricted gift
of $1,000 to UTC.
2. The
company spends $1,000 on costumes and sets for the coming season.
3. Over the course of the year, the company sells $3,000 of tickets for the year's performances.
4. Over the course of the year, the company spends $1,000 on the rental of auditoriums and other costs associated with putting on the year's productions.
5. Towards the end of the year, the company launches a new initiative to make advance sales of tickets for the next year's season. $1,000 in advance sales are made.
v
$1000 gift
¯
where does
it go?
¯
What
balances it?
v
$1000 spent
on costumes and sets
v
Sell $3000
¯
$3000 to
cash
¯
could be
balanced by equity, but É
¤
we want to
keep track of income and expenses
¤
so as to be
able to write an income statement
¤
so put it
there
¤
and plan to
transfer to equity when we close the books, subtract expenses
¤
add net
income to equity
v
Spent $1000
on rental etc.
¯
Subtract
from cash
¯
Balance
where?
v
Make $1000
in advance sales
¯
Add to cash
¯
Balance
where?
¯
Is this
income?
¯
Liability?
Do we owe it to anyone?
¯
From the
standpoint of an income statement, we owe it to next year's income
¯
Since we
want to match up income with expenses
¯
So it goes
to deferred income
v
Costumes
don't last forever--how do we include depreciation in this?
¯
Suppose
four year lifetime--25% depreciation each year
¯
Where does
it go?
¤
Reduction
of inventory, and É
¤
Could be
reduction of equity, but É
¤
We are
trying to keep track of expenses, so
¤
Goes to
expenses
v
At the end
of the year we close out the books
¯
Add up cash
credits and debits, starting cash, gives final cash
¤
$2000
initial
¤
+$5000
debits
¤
-$2000
credits
¯
Add up
costumes etc:
¤
$3000+$1000-$1000
¤
=$3000
final.
¯
Add up
income and expenses, transfer to equity (Surplus)
¯
End up with
a balance sheet
¤
$5000 cash + $3000 costumes and sets = $8000 assets
¤
$1000
deferred income +$4000 loan+$3000 equity=$8000
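A Python sketch re-deriving the year-end figures from the transactions above (depreciation taken as 25% of the $4000 of costumes and sets on hand, as in the addition above). It is just the arithmetic, not a full set of books.

    cash, costumes, loan, surplus = 2000, 3000, 4000, 1000   # opening balance sheet
    deferred_income = income = expenses = 0

    cash += 1000; surplus += 1000          # 1. unrestricted gift, straight to equity
    cash -= 1000; costumes += 1000         # 2. buy costumes and sets: asset-for-asset swap
    cash += 3000; income += 3000           # 3. this year's ticket sales
    cash -= 1000; expenses += 1000         # 4. auditorium rental etc.
    cash += 1000; deferred_income += 1000  # 5. advance sales for next season: a liability
    costumes -= 1000; expenses += 1000     # depreciation, 25% of 4000

    surplus += income - expenses           # close the books: net income goes to equity

    print(cash + costumes, "=", deferred_income + loan + surplus)   # 8000 = 8000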
v
using the
information
¯
a potential
donor wants to know if his money is
¤
needed
¤
going down the drain ("throwing good money after bad")
¯
a potential
lender wants to know if the company will be able to pay him back
¯
a
government agency that wanted to subsidize the arts might want to know if these
are good people to subsidize.
v
Summary of
what we have done
¯
Asset
adding to equity (donation)
¯
Asset
converted to another use (cash to costumes)
¯
Cash
balanced with income (ticket sales, will go to equity after netting costs when
books are closed)
¯
Expenditure
balanced with expenses (rental etc.--will go to É)
¯
Asset
balanced with a liability to the future (advance sales)
¯
Depreciation:
Reduce an asset, balance with an expense, will go to É
v
Back to the
matching principle
¯
Ambiguity
¤
Revenue
should be allocated to the period during which effort is expended in generating
it
¤
An expense
should be allocated to the period in which the benefit from it will contribute
to income generation
¤
So if
expense is in year 1, revenue in year 2
¤
Do you move
expense forward or revenue back?
¯
If
"effort expended" has an unambiguous date, move it to that year?
¯
Otherwise,
answer depends on when you have the information?
¤
If we have
an expense this year for income next year
á
We probably
don't yet know the amount of the income
á
So move the
expense forward--"prepaid expenses"
á
Similarly
for "Deferred Income"--we'll know more next year
¯
"accounts
receivable" go the other way--moving income back
¤
because the
amount is (hopefully) known
¤
as are the
rest of the associated expenses and income
v
fixing the
oil problem (Figure 4-6)
¯
what was
left out of the story and the accounts?
¯
What
happens when we put it back in?
¯
We are
moving profits into equity--eventually
v
"Conservative
bias"
¯
a
misleading term if it means "err in the direction of underestimating
income and equity"
¤
intangibles,
after all, can go down as well as up
¤
as can the
market value of assets
¤
and
ignoring changes in overall prices actually overstates income--eventually
á
Buy
something for $1000
á
All prices
double
á
Sell it for
$2000
á
Accounting
profit: $1000
á
Actual
profit: zero
¯
more nearly
a sceptical bias
¯
Err in the
direction of ignoring things hard to measure
¤
Count
intangibles if they have actually been bought at a price
¤
Use market
value for financial assets where it is easily determined
v
Defining an
entity
¯
How to
reduce your taxes
¤
Have a
small business
¤
Treat
expenses for things used in your business and for consumption as business
expenses
¤
IRS rules
try to prevent this--home office, automobile, etc.
¤
But you are
the one structuring and monitoring things
¯
What is
happening is that you are (deliberately) blurring the lines between two
entities
¤
Your
business and
¤
You
¯
How to run
a law school at a profit (or loss)
¤
Some costs
could be counted as costs of the Law School or of the whole university
á
Maintenance of our lovely campus
á
Some
publicity costs
¤
Attribute
them to the university, and the law school is making a profit
¤
To the law
school, and it is making a loss
¤
Sometimes
law schools or business schools have agreements with the university they are
part of
á
Defining
how costs are divided
á
And how
much of the school's revenue the university is entitled to
á
Which might
be based on profit rather than revenue
á
In which
case the accounting matters
¤
Sometimes
it might pay to move some activities into the law school to make those lines
clearer
á
Suppose the
Law School thinks the university charges it too much for keeping track of
student records
á
Shift to the
law school having its own people keep track of its students
á
Have
lunches in the faculty lounge instead of
Benson
¤
Do you
prefer a profit or a loss?
á
Depends who
you are talking to
á
If you owe
a percentage of your profit to the University, prefer a loss
á
If you are
raising money, probably prefer a profit--but not too big a profit.
¯
Enron
¤
Create an
entity whose books your firm's accountants won't see
¤
Shift
losses to it
á
Sell
something to the entity at much more than it is worth
á
Or buy
something for much less
¤
Or shift
gains to it before Enron goes bankrupt--and make sure you control the entity
¤
One reason
lawyers worry about making sure transactions are "arm's length."
2/14/06
Snowplowing
Joe Landscaper and Gill Snowfall have both been in the business of plowing driveways for a number of years. Their only revenues are payments they receive for their plowing services. Their only expenses are from the purchase of gasoline and the wear and tear on their trucks.
A.
Joe plows driveways in December and is paid $500 in cash.
B.
Gill also plows driveways in December and sends his clients bills for $600.
C.
Joe gets $200 of gas in December and puts it on his credit card.
D. Gill buys $250 of gas in December
and pays cash.
Who
had a better month?
E. On January 1st, Gill's old truck dies and he decides to purchase a new truck for $10,000.
How
would you account for this transaction?
v
Payments
for Joe and Gil in December
¯
Joe gets
$500 in cash, balanced by ?
¯
Gil gets
$600 in ? balanced by ?
¯
When are we
recognizing income?
v
Expenses
for Joe and Gil in December
¯
Joe buys
$200 of gas on his credit card
¤
$200 in
expenses
¤
where does
the matching $200 go?
¯
Gill buys
$250 of gas, pays cash
¤
$250 in
expenses
¤
and É?
¯
We want
both expenditures to be in this period
v
Who had a
better month?
¯
Compare
their income statements
¯
Whose
income minus expenses figure was larger?
¯
Whose cash
has increased more?
¯
Which
matters? When? To whom?
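One way to line the two pictures up, as a Python sketch using the December figures in the handout: Gill looks better on an accrual income statement, Joe looks better if all you watch is cash.

    # December, from the handout above
    joe_income = 500 - 200         # revenue earned minus expenses incurred
    gill_income = 600 - 250

    joe_cash_change = +500         # the $200 of gas went on a credit card (a payable), not out of cash
    gill_cash_change = -250        # the $600 billed sits in accounts receivable, not cash

    print("Accrual income:", joe_income, "vs", gill_income)            # 300 vs 350
    print("Change in cash:", joe_cash_change, "vs", gill_cash_change)  # +500 vs -250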
v
Gill buys a
new truck
¯
Asset for
asset swap
¯
At what
point does the cost of the truck show up as an expense?
v
Does this
fit the matching principle?
v
Stuff I'm
leaving out
¯
The
discussion of standards, boards, etc. matters
¯
But it
isn't about analytical methods, although it is about accounting
¯
So you can
probably make sense of it just as well without my help as with it.
v
How to use
the information accounts provide?
¯
Who are
you?
¤
Investor,
interested in long term expectations of the firm
¤
Lender--wants
to know if he will be paid back
¤
Supplier--wants
to know if he will be paid. Lawyer, for instance.
¤
Employee
¯
Will the
firm be able to meet its short term obligations?
¤
Compare
short term assets (Cash, accounts receivable, inventory)
¤
To short
term liabilities (bills payable, short term loans, É)
¤
Is
"assets more than liabilities" enough?
¤
Depends how
fast that is likely to change
á
Lender has
some control over that via contract
á
Can require
borrower to maintain some financial ratio
¤
Rule of
thumb: current assets should be at least 1.5 to 2 times current liabilities
¤
What if
current assets almost all in inventory? In accounts receivable?
¤
How could a
firm improve its short term situation?
á
Take out a
long term loan
á
Increase
its cash, or É
á
Reduce
short term debts
á
Does not
increase long term solvency, but É
¬
The fact
that someone is willing to make a long term loan to them
¬
Is evidence
that the lender thought they were solvent
¬
But É might want to check on the interest
rate.
¯
Is the firm
solvent--long term obligations?
¤
Look at
ratio of liabilities to
á
Assets, or
É
á
Equity.
á
Are these
really different measures?
á
Could a
firm look good on one and bad on the other?
¤
Leverage
á
Consider a
firm with $10 million in assets, $9 million in liabilities
á
What are
the good things about that situation?
á
What are
the bad things?
á
For whom?
¬
Stockholders
¬
Lenders
á
Why would
much higher degree of leveraging be acceptable in some industries than in
others?
¬
How
predictable is the value of Apple's inventory of iPods vs
¬
Merrill
Lynch's inventory of securities?
¤
Look at
interest payments vs earnings available to pay them
á
Interest
coverage
¬
Calculate
from Figure 4-3
¬
Operating
Earnings/Interest expense
á
How close
is the firm to being unable to pay interest on its debts?
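A Python sketch of the ratios discussed above, with made-up figures (the numbers are mine, not from any of the cases):

    current_assets      = 300_000   # cash + accounts receivable + inventory
    current_liabilities = 150_000   # bills payable, short-term loans, ...
    total_liabilities   = 450_000
    equity              = 250_000
    operating_earnings  = 80_000
    interest_expense    = 20_000

    current_ratio         = current_assets / current_liabilities   # rule of thumb: roughly 1.5 to 2
    liabilities_to_equity = total_liabilities / equity
    interest_coverage     = operating_earnings / interest_expense  # how many times over interest is covered

    print(current_ratio, liabilities_to_equity, interest_coverage)  # 2.0 1.8 4.0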
¯
How well
run is the firm?
¤
Accounts
receivable/sales revenue--how long does it take the average customer to pay?
á
Depends on
the industry as well as management
á
How long
does it take McDonald's average customer to pay?
¤
Turnover
ratio: How fast does the firm turn over its inventory?
á
"Just
in time production" is a limiting case
á
But a firm
that is doing a bad job of estimating demand will have inventory build up, or É
á
Be short--high
turnover ratio could be evidence of a mistake
á
But also
success--high demand for their goods.
¤
What is the
average interest rate the firm pays on its loans?
á
A high rate
might be evidence of bad shopping for loans, or É
á
A high risk
premium
¤
For all of
these, one would want to compare to other firms in the same industry
¯
How
profitable is the firm?
¤
Note that
"profit" means a lot of different things
¤
Revenue
minus cost
á
The
supermarket pays a dollar for that box of cereal
á
Sells it
for two dollars
á
So their
profit is 100%!
á
If only we
cut out the middle man É
¬
Set up a
consumer's coop
¬
Get
government to distribute food instead of the supermarket
á
But all of
those alternatives require
¬
Salary to
employees
¬
Rent,
utilities, maintenance on the facilities
¬
Interest on
the money used to buy the inventory
¬
Allowance
for spoilage, unsold goods, theft, É
¤
Operating
Earnings/Revenue
á
Operating
Earnings: Revenue minus cost of goods sold and indirect expenses
á
What is
available to pay interest, taxes, dividends, and increase equity
¤
Return on
assets
á
Net income
(after paying everything including interest and taxes)
á
Divided by
total assets
á
If this
company has a higher ROA than most, then either …
¬
It is
unusually well run (or lucky), or É
¬
Someone
else should get into the business too
¬
Duplicate
its assets with an investment I, get higher than the usual return.
¤
Return on
equity
á
Same as ROA
if no liabilities
á
Think of
equity as what the owners would get if they liquidated the firm--carefully.
á
If return
on equity is more than the market interest rate, they are better off keeping
the firm going
¯
Two
qualifications
¤
Some of
these will be different in different industries
¤
All of
these are subject to the problems with accounting as a measure
¤
Consider a
firm
á
whose chief
asset is land bought long ago for a million dollars, now worth $100 million
á
no large
liabilities
á
And
currently making $1 million/year
¤
Making $1
million on assets of $1million is stellar performance
¤
So is $1
million on equity of $1 million
¤
Is the firm
doing well? Should the owners keep going or sell out?
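The land example worked out as a Python sketch: the same firm and the same $1 million of income give a spectacular return on the book value of its assets and a very poor return on what the assets could actually be sold for.

    net_income            = 1_000_000
    book_value_of_land    = 1_000_000     # what it cost long ago, which is how the accounts carry it
    current_value_of_land = 100_000_000   # what it could be sold for today

    print(f"ROA on book value:    {net_income / book_value_of_land:.0%}")     # 100%
    print(f"ROA on current value: {net_income / current_value_of_land:.0%}")  # 1%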
¯
Book value
of a share
¤
Equity
divided by number of shares
¤
A good
measure--if equity really measures what the stockholders own.
¤
The usual
problems
á
Historical
costs
á
Neglect of
intangibles
á
And
contingencies
¯
Earnings
per share
¤
Net income
(after everything)
¤
divided by
number of shares
¤
If an
accurate measure
¤
And if
likely to continue into the long future
¤
A good
basis for what the share is worth, but É
¤
If it isn't
worth that on the market, someone may know something you don't.
¯
Price/earnings
ratio
2/26/06
v
Taking
advantage of accounting flaws
¯
You are the
CEO of a company, and want its balance sheet to look good
¤
Perhaps you
are trying to get a loan
¤
Or issue
some new stock
¤
Or justify
your lavish retirement terms
¤
Or conceal
the fact that you've been stealing from the company
¯
What
perfectly legal steps might you take to increase equity
¤
As defined
by accountants
¤
Other than
increasing the real, long term value of the company.
¯
What if you
want the balance sheet to look bad
¤
Because you
want to drive down the stock price before your friend buys lots of it
¤
Or you are
planning to take the company private, and want to pay the stockholders as
little as possible
¤
How do you
lower equity as measured by accountants, without actually hurting the company,
at least very much?
¯
Why are the
answers to these questions of interest to you as a lawyer?
¤
One reason
is that you might want to advise a client as to legal ways of fooling people
¤
Is there
another--perhaps more ethically attractive--reason?
v
Animal
Rights league
¯
Shifting from
cash to accrual
¯
How to
account for pledges?
¤
Debit
pledges Receivable, credit Revenue
á
Next year,
$285,000 in pledges actually paid
á
Debit cash
$285,000, debit revenues $15,000 (pledge write-off)
á
Credit
pledges receivable $285,000 + $15,000 (two items)
á
Note that
pledges paid are an asset for asset swap
á
Pledges
written off reduce revenue
¤
or …
¤
Figure that
pledges are payment for future services
á
Debit
pledges receivable
á
Credit
deferred income
á
Then next
year
á
Debit
deferred income
á
Credit
revenues
¤
To decide,
ask whether the revenue is from the telethon or advance payment for next year's
work
¤
Third
alternative—expected value
á
On average,
$100 in pledges is only $95 in expected contributions
á
So debit
pledges receivable this year at $285,000, credit revenue with the same
á
Next year,
credit pledges receivable, debit cash
á
More
accurate, less of a hard number (probability), more of an economist's approach,
less of an accountant's
¯
How to
account for moving expenses and salary of new executive director
¤
Capitalized
(an investment, to be depreciated) or expensed?
¤
Start by
crediting cash $150,000, which is no longer in your account
¤
If you
expense it, debit expenses by $150,000, easy
¤
If you
capitalize it
á
A new
asset—prepaid moving expenses, debit $150,000
á
Each year,
credit that by that year's share, debit the same amount to expense (of having
an executive director).
á
Amortize
1/5 each year
¤
What if you
capitalize it, and she quits after a year
á
Remaining
$120,000 is written off—investment that went bad
á
Credit
prepaid moving expenses (an asset, now reduced to zero)
á
Debit
expenses (which will get subtracted from revenue)
¤
Expensing
easier, more common, but É
¤
For a small
company, large expense, amortizing it may be more realistic
¤
Since
otherwise you lose lots of money the first year.
¯
Note that
both of these raise the question of allocating income and expenses to the right
period
¯
In both
cases, the way you do it depends on a guess about the future
¤
Pledges
might not be honored
¤
Jane might
quit
v
Energy
Cooperative
¯
Basically buying and selling fuel, with
a subsidy
¯
How to
account for cost of fuel bought: LIFO or FIFO
¯
Should they
write off the value of half the computers, now obsolete
¯
Constraint:
In default on a loan if
¤
Return on
Assets falls below 5% or
á
What is it
now
á
Net
Income=$31,000. Assets $300,000.
á
ROA>10%
á
No problem?
¤
Liabilities
to Surplus ratio above 200% ("Surplus" = "Equity")
á
What is it
now?
á
200%. Oops.
¯
LIFO will
raise the
¤
(accounting)
cost of fuel sold (priced at higher current price)
á
So next year's net income will be less if we switch to LIFO
á
Lowering
the ROA--but unless the effect is very big, still no problem
¤
what about
the value of inventory?
á
Does not
affect the left hand side of the accounts--total value of what you bought is
what you paid for it
á
Affects the
right hand side--LIFO means inventory value falls faster as you sell oil
á
Since you
are "selling the more expensive (later) oil first"
¤
So assets
will be lower at the end of next year if we use LIFO
á
Which
raises ROA, reducing any problem from lower income. But
á
Lower
assets mean lower surplus mean higher liabilities/surplus
á
Oops. We are in default.
¯
Writing off
computers
¤
Credit
(Reduce) inventory, hence assets
¤
Reduce net
income by $20,000
¤
Reduce
surplus by $20,000
¤
If we did
it for this year, net income from $31,000 to $11,000
¤
Assets from
$300,000 to $280,000
¤
Pushing ROA
below 5%, in default
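A Python sketch checking the two loan covenants, using the figures above plus one inference of mine: with assets of $300,000 and a liabilities-to-surplus ratio at 200%, liabilities must be about $200,000 and surplus about $100,000.

    def covenant_check(net_income, assets, liabilities, surplus):
        roa = net_income / assets
        liabilities_to_surplus = liabilities / surplus
        in_default = roa < 0.05 or liabilities_to_surplus > 2.0
        return in_default, round(roa, 3), round(liabilities_to_surplus, 2)

    # As things stand: ROA above 10%, the ratio sitting right at the 200% limit
    print(covenant_check(31_000, 300_000, 200_000, 100_000))   # (False, 0.103, 2.0)

    # Write off $20,000 of obsolete computers: net income, assets, and surplus all fall by 20,000
    print(covenant_check(11_000, 280_000, 200_000, 80_000))    # (True, 0.039, 2.5)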
¯
In either
case, there are arguments for the change so É
¤
Before
making it
¤
See if you
can negotiate a change in loan terms, or É
¤
Refinance
v Review
The course so far.
v Decision Analysis
¯ Way of formally setting up a problem to help you decide what to do
¤ Does not provide the information:
á Choices to be made and how they are related (the graph)
á Probabilities
á Payoffs to the various outcomes
á But it does point out to you what information you must obtain
¤ Set up a graph showing
á alternatives you choose
á alternatives chosen by chance, with their probabilities
á outcomes, with their payoffs--how much better or worse are you (or your client) if it comes out that way.
¤ Start at the right end--final outcomes
á At each point where you make a decision--the last one you will make--evaluate the expected value from each choice
á The final choice leads either to an outcome, with a value, or É
á To a further choice made by chance, and you can evaluate its expected value: the sum of probability times payoff
á One of the alternative choices you can make gives the highest payoff--eliminate the others (cut off the graphs)
á Now that decision point has a value, just like the payoff of an outcome--the expected value from making the right choice there.
á Do this for all your final decision points
¤ Repeat the process at the next decision point left, repeat for all those.
¤ Continue until you know all decisions you will make
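A minimal Python sketch of the folding-back procedure, with made-up numbers for a hypothetical settle-or-go-to-trial choice: chance nodes get their expected value, decision nodes get the best of their branches, and the inferior branches are lopped off.

    def chance(branches):
        """branches: list of (probability, value); the node is worth its expected value."""
        return sum(p * v for p, v in branches)

    def decide(options):
        """options: dict of choice -> value; keep the branch worth the most."""
        best = max(options, key=options.get)
        return best, options[best]

    # Hypothetical numbers: a $50,000 settlement offer, or a trial costing $30,000
    # with a 50% chance of a $120,000 award
    trial = chance([(0.5, 120_000 - 30_000),   # win: award minus trial costs
                    (0.5, 0 - 30_000)])        # lose: only the trial costs
    print(decide({"settle": 50_000, "go to trial": trial}))  # ('settle', 50000); trial averages 30000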
¯ How do you get the information to set up the problem?
¤ Not from decision theory
¤ From your expert knowledge of the situation
¤ Your client's expert knowledge
¤ Research you can do, such as looking at similar cases to see their outcome
¤ Consulting with other experts
¯ Sensitivity analysis
¤ Since the numbers are probably uncertain
¤ It's worth varying them a bit, and seeing if your decision changes
¤ If the decision is very sensitive to some payoff or probability, perhaps you should investigate further to make sure you have it right.
¯ Risk aversion
¤ So far I have assumed you are maximizing expected return--the sum of dollar payoff times probability over all alternatives of the decisions controlled by chance
¤ For gambles small relative to your assets, that is the right thing to do
¤ For large gambles, the fact that additional dollars are probably worth less to you the more you have comes into play
¤ You have to ask yourself which gamble you prefer, not merely which has the larger expected return.
v Game Theory: Strategic behavior. My best move depends on what he does, his best on what I do
¯ Bilateral monopoly bargaining
¤ Common interest in getting agreement
¤ Conflict over who gets how much
¤ Bluffs, threats, commitment strategies
¯ Many player game adds in the possibility of coalitions
¯ Can represent a game as
¤ A sequence of choices, like decision theory, but with two (or more) people plus chance making decisions
á Useful for solving a game by finding a subgame perfect equilibrium
á Very much like the decision theory approach
¬ start at the right, the last decision anyone makes
¬ figure out which choice at that point is in that chooser's interest, lop off all others
¬ do it for all the rightmost choices
¬ then move left and do it again
¬ I don't have to worry that if I do X he will do Y if I know that, once I do X, it will be in his interest to do Z instead.
¤ Note that this assumes away commitment strategies
á "If you do X I will do Y, which hurts you
á even though it hurts me too
á because knowing that, you won't do X, and that benefits me."
¤ In some games--one time, no reputation--commitment strategies are unlikely, so subgame perfect is a sensible approach
¤ In other games that is not the case.
¯ Represent as a strategy matrix (two player games usually)
¤ A strategy is a full description of what I will do given any sequence of choices by you and by chance
¤ Given my strategy and yours, there is some outcome, or expected outcome, that results
¤ So we can imagine a matrix showing all my strategies down the left, all yours across, outcomes for both of us in the cells
¤ And this approach, for a zero sum game, gets us to the Von Neumann solution
¤ A pair of strategies, each optimal against the other.
¯ One can look for a dominant solution to such a matrix
¤ As in prisoner's dilemma
¤ One choice is best for me, whatever you do
¤ Another best for you, whatever I do
¤ So we will choose those two
¤ Of course, there may be no such solution.
¯ One can look for a Nash equilibrium to a many player game
¤ My strategy is optimal for me, given what everyone else is doing
¤ The same is true for everyone else
¤ But we might be all better off if we all changed together
¤ For instance, from driving on the left to driving on the right.
¤ Or even if two of us changed together
¤ For instance, both rushing the guard instead of going back to our cells.
¯ Von Neumann solution to many player game
¤ Not in the book, not responsible for
¤ But I sketched the idea briefly.
¯ Schelling points
¤ In a bargaining situation, people may converge on
¤ An outcome perceived as unique--50/50 split, or what we did last time, É .
¤ Because the alternative is to keep bargaining, and that is costly.
¯ Moral Hazard: Economics not game theory but in the chapter
¤ If part of the cost of my factory burning down is paid by the insurer
á I will only take precautions whose benefit is enough larger than their cost so that they pay for me as well as for us
á So some worthwhile precautions won't be taken
á Applies to any situation where someone else bears some of the cost of my action.
¤ One solution is for the insurance company to require certain precautions
¤ Another is to reduce the problem by not insuring too large a fraction of the value
¤ But sometimes, moral hazard is a feature not a bug, because the insurance company now has an incentive to keep the factory from burning down, and might be better at it than you are.
¯ Adverse selection: Also economics not game theory
¤ Market for lemons--problem with used cars
¤ Might solve by guaranteeing the used car--but that raises moral hazard problems.
¤ Bryan Caplan on a blog: Why doesn't this destroy the adultery market?
á Why do you want him to leave his wife and marry you if
á He's the sort of bum who is first unfaithful to his wife and then dumps her?
á http://econlog.econlib.org/archives/2006/02/lemons_for_vale.html
v Contracting
¯ Basic idea: How to maximize the total gain from the contract. All the rest is bargaining over cutting the pie.
¯ Basic solution--give people the right incentives.
¤ Arrange it so that if something costs $10,000 and produces a combined benefit for the parties of more than that, it is done, if less than that, it isn't
¤ Where something might be
á What materials you use to build a house
á Searching for the best price
á Deciding to breach the contract
á É
¯ construction contracts: Two and a half basic forms
¤ fixed price
á incentive to minimize cost
á but also to do it by skimping on quality
¤ cost +
á no incentive to minimize cost
á or skimp on quality
¤ cost +percentage of cost
á incentive to maximize cost
á and build only gold-plated Cadillacs
¤ choose according to
á which problems are hardest to control
á who you want to allocate risk to
¤ ways of trying to limit the damage done by the wrong incentives in each case
á remembering that what you can specify is limited by
á what you know enough to specify (quality, for instance)
á and what you can observe.
¯ Other sorts of contracts add another interesting option
¤ Pay by results
¤ For instance a contingency fee for a law firm.
¤ Or commissions for salesmen
¯ We discussed
¤ Principal/agent
¤ Joint undertaking
¤ Sale or lease of property
¤ Loan
v Accounting
¯ Understand four things about the mechanics
¤ A balance sheet
¤ Cash flow
¤ Income statement
¤ T accounts
¯ And how they are related
¤ T accounts show each transaction
á Twice
á Once on the left side, once on the right
á Either because a gain balances a loss or
á Because a gain without a loss increases income and eventually equity
¤ Fundamental equation: Assets=liabilities+equity (assets-liabilities=equity)
á To keep that true when a transaction occurs, either
á Liabilities don't change (increase one, decrease one)
á Assets don't change (increase one, decrease one)
á Change in assets equals change in liabilities
á Change in assets or liabilities is reflected in change in equity
á Some combination of the above
¤ Complications
á Allocating income and expenses to the right time period—not always when income received or expenses paid
á Various simplifications of what is really happening, to reduce the influence of judgement calls and thus reduce the ability of the accountant or firm to manipulate results
¬ Purchase price rather than market value
¬ Ignore intangibles unless they were purchased
¬ Treat uncertain outcomes as zero probability (p<.5) or certain (p>.5)
3/7/06
v What is finance
¯ Analysis of decision problems involving
the allocation of resources over time
¯ In a world of uncertainty
v The nature of the firm
¯ Coase.
¯ Why is the capitalist beach made up of
socialist grains of sand?
¯ The inside contracting system
¤ Firm A makes gun stocks
¤ Firm B makes the barrels
¤ Firm C makes the receivers
¤ Firm D assembles and sells the guns
¤ What happens to B, C, and D if A is shut
down because its owner gets sick?
¯ More generally, think about an economy
which was markets all the way down
¤ Some parts of ours come close
á
The one
person law firm--but he probably hires a secretary
á
People who
mow lawns
á
Free lance
writers
¤ Markets work well for selling a well
defined good at a time--mowing a lawn
¤ For performance over time, we need
contracts
á
And we have
seen some of the potential problems that contracts raise
á
And the problems
with trying to control them
¤ So one solution is a firm instead
á
The
contract is "you do what the boss tells you within the following
limits"
á
And if you
don't like it you quit
¤ But that solution raises its own problems
á
Instead of
the costs of transacting in the market, you have
á
The costs
of monitoring your employees to make sure they are serving their employer's
interest, not just their own
á
Which gets
harder and more expensive the more layers of control there are.
á
Also É
¯ Berle and Means (actually Adam Smith)
problem
¤ If the firm needs a lot of capital it
organizes as a joint stock company
¤ Each individual stockholder has little
incentive to follow what the firm is doing or try to use his vote to affect it
¤ So management can do what it likes with
the stockholders' money
¤ Are there mechanisms to control this
problem?
á
Base
rewards on performance--bonuses, options
á
Takeover
bids and the threat thereof
á
Hedge fund
vs Mutual fund story
¬ Mutual fund managers get a fixed
percentage of funds they manage
¬ Hedge funds, a percentage of the increase
in fund value
¬ Both are potentially large stockholders
with an incentive to monitor management
¬ Some evidence that hedge funds do it
better
¯ Because their managers rewarded directly
for success
¯ Because mutual funds are judged by
relative performance, and hold many of the same stocks as their competitors
¯ Also, a controlling group of stockholders
might be able to benefit themselves at the cost of other stockholders
¤ Firm A owns a large chunk of firm B, gets
B to agree to contracts favorable to A.
¤ "Empty voting" story.
¤ Majority stockholders might take firm
private on terms favorable to themselves
¤ Are there mechanisms for controlling this
problem?
v Relevance to legal issues
¯ The size of the firm
¤ If firms want to merge, are there
benefits?
á
Relevant to
anti-trust law, where mergers are suspect
á
Stockholders
might be injured if managers are empire building
á
So Coaseian
arguments about what activities ought to be inside or outside the firm become
relevant
á
Also
relevant to a CEO simply trying to do his job, serve the stockholders.
¤ If a firm wants to spin off parts of it,
are there benefits?
á
If the firm
is worth more in pieces than as a whole
á
Stockholders
will benefit by the breakup
á
Management
might not
¯ Managerial discretion
¤ On the one hand, the reason the firm
exists
¤ On the other, an opportunity for managers
to benefit themselves at the cost of stockholders
¤ Parkinson story.
¤ Should "socially responsible"
firms be suspect?
á
Donation to
art museums, opera, É
á
Helping out
local schools?
á
Treating
employees better than the terms of the contract requires?
¯ Limits to majority stockholder control
v Coase, Miller/Modigliani, and simplifying
assumptions
¯ Coase analyzed externality problems in a
world with zero transactions costs
¤ Not because he believed we are in such a
world, but É
¤ To show that in such a world the
conventional analysis would be wrong
¤ Hence that the problems in some sense
came from the transaction costs
¤ Which is relevant to understanding their
implications
¤ If sufficiently interested, see several
chapters of my Law's Order
or my "The World According to Coase" on my web page.
¯ Miller and Modigliani analyzed the
equity/debt question in a world of perfect information etc.
¤ Because showing that the ratio doesn't
matter in that world
¤ Shows that the reasons it does matter
have to do with imperfect information and the like.
v Miller/Modigliani Theorem
¯ A firm can finance itself with debt or
with equity
¤ Debt means the obligation to pay a fixed
amount
¤ Equity gives a fixed share of the income
stream
á
Sort of
á
Since the
firm gets to decide whether to pay out dividends or retain earnings
á
But the
retained earnings go to the firm, which the equity holders own.
¤ Historically, equity pays a higher return
than debt
á
If saving
for the long term
á
You are
almost always better off owning stock than bonds
á
But É
¤ The return on equity is less certain
¯ Using debt is cheaper, so why not?
¤ The larger the fraction of the firm is
debt, the more highly leveraged it is
¤ All variation in firm income goes to the
equity holders
¤ So the uncertainty in the stock goes up,
raising the risk premium
¤ And at some point, the amount of equity
is low enough so that the lenders suspect their loan might be at risk--and
charge a higher interest rate.
¤ One of the points we looked at in the
previous chapter
v Jensen and Meckling
¯ Incentive of firm managers as a special
case of agency theory
¯ If you are my agent, I want you to act in
my interest
¤ But you will act in your interest
¤ So I try to make it in your interest to
act in my interest
¯ The problem results in three costs
¤ The cost to me of making you act in my
interest--monitoring
¤ The cost to you of doing things that will
make you act in my interest, so that I will hire you--for example posting a
bond that forfeits if you don't
¤ The net cost of your not acting in my
interest in spite of the first two
¤ Note that it's a net cost
á
If we can
predict that you will act in a way that benefits you by $2000
á
And costs
me $3000
á
The net
cost is only $1000
á
And that is
also the maximum cost to me--because knowing that,
¬ I will offer you at least $2000 less than
if you were not going to do that
¬ And you will accept at least $2000 less.
¤ So the total cost due to the agency
problem is the sum of the three
¯ In the case of a firm manager
¤ If he owns the whole firm, it's in his
interest to maximize profit
á
Taking account
of not only pecuniary costs (money)
á
But
anything else that matters to him
á
Such as
being liked by his employees or respected by his neighbors
á
Or not
working too hard.
¤ The more of the ownership goes to other
people, the less that is true
á
Just as the
factory owner who has insured against fire for 90% of the value will only take
precautions whose benefit is much larger than their cost
á
So the CEO
who only owns half the firm will only work harder if it produces at least $2 of
firm income for each $1 worth of effort
á
Except that
if other people own more than half, they might fire him if they see he isn't
working hard, or in other ways is sacrificing their interest to his
á
Which
requires monitoring by the stockholders
á
Which is
hard if stock ownership is dispersed
¯ So there is a real advantage to the firm
run by its 100% owner
¤ And in many cases that is what we see
¤ The problem arises mostly if the firm
needs more capital than the owner's wealth
¤ Which could be borrowed--debt rather than
equity
¤ But the highly leveraged firm is risky
for the owner, and É
¤ The lenders
¯ There is also a real advantage to a firm
with concentrated stock ownership
¤ Because the large stockholder has an
incentive to monitor management
¤ And if necessary try to get together with
other large stockholders to replace it
¤ Hedge fund/mutual fund/stockholder
situation
¯ All of which explains part of why firms
are sometimes taken private
3/9/06
v Review
¯ Coase
¤ Firms exist because there are costs to
organizing cooperation by exchange and contract on the market
¤ There are also costs associated with
organizing cooperation by hierarchical authority
¤ A lot easier to buy paperclips and paper
on the market than to produce them yourself
á
Your top
management doesn't have to know how the production is organized
á
And
competition will get you the lowest cost.
¤ But having a key employee outsourced
could raise a lot of problems
á
You can't
do without him, so he could jack up his price, claim costs
á
And you
don't know and can't afford to turn him down.
v Berle/Means/Smith
¯ With dispersed ownership, stockholders
have little incentive to monitor the managers who are their agents for running
"their" firm.
¤ So managers can serve their own
objectives with the stockholders' money
¤ Which might mean being lazy or
incompetent
¤ Or paying themselves lots of money
¤ Or buying status by contributing the
firm's money to "worthy causes."
¯ Legal restrictions on such behavior are
weak
¤ ("business judgement rule")
¤ perhaps have to be weak if the firm is to
work as a hierarchical structure run by management
¯ Market restrictions exist via the threat
of proxy fights, takeovers
¤ Ownership of shares doesn't have to be
dispersed all the time
¤ Becomes concentrated if someone is buying
stock to get control
¤ Or via large institutional
stockholders--pension funds, mutual funds, hedge funds.
¤ What is a "junk bond" and why
is it called that?
¯ But conflicts over stockholder control
raise a new problem
¤ One group of stockholders, with an
effective majority, might benefit themselves at the expense of other
stockholders.
¤ Either by how the company is run, or É
¤ By taking the company private, or merging
it, on terms favorable to themselves
¤ The law tries to prevent this by
requiring equal treatment.
v Miller/Modigliani Theorem
¯ A company can finance itself with either
debt or equity
¯ Debt, historically, receives a lower
interest rate than equity
¯ But the higher the fraction of the
financing is via debt, the riskier the equity
¤ Why not a 100% debt?
¤ Who then is the residual claimant?
¤ And what happens to the risk of default
¯ It turns out that, if you make some
simplifying assumptions, the value of the firm is the same whatever the mix of
debt and equity it chooses
¯ Which suggests looking at the failure of
those assumptions in deciding what mix to use.
v The firm as a problem in agency theory
¯ How do principals control agents?
¤ By monitoring their behavior--at some
cost
¤ And punishing or firing them if they are
not acting in the interest of the principal
¤ My recent tire purchase as an example
¯ How much control should there be?
¤ The amount that minimizes the sum of
¤ Cost to the principal, cost to the agent,
net cost due to insufficient control
v Time value of money
¯ How do you compare a payment today with a
larger payment in the future
¤ Or a stream of payments over time with a
single sum today
¤ For instance the income from owning a
share of stock vs its present market value
¯ How compound interest works
¤ Suppose the interest rate is 10%=.1
¤ $1000 this year gives you $1000x(1+.1)
next year gives you $1000x(1+.1) x(1+.1) in two years, and so on
¤ if we call the interest rate r, then
¤ $1000 this year gives you $1000x(1+r)
next year gives you $1000x(1+r) x(1+r) in two years, and so on
¤ if the interest rate is small and the
number of years is small, adding works pretty well
á
1%
compounded over 5 years is only a tiny bit more than 5%
á
but 10%
compounded over 10 years is quite a lot more than 100%
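The compounding arithmetic as a short Python check of the two claims above:

    def future_value(amount, r, years):
        return amount * (1 + r) ** years

    print(round(future_value(1000, 0.10, 1)))    # 1100
    print(round(future_value(1000, 0.10, 2)))    # 1210
    print(round(future_value(1000, 0.01, 5)))    # 1051 -- only a little more than adding 5% would give
    print(round(future_value(1000, 0.10, 10)))   # 2594 -- far more than adding 100% would give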
¯ Suppose you are comparing $1000 today
with $1100 a year from now
¤ If you have $1000 today you can
á
put it in
the bank and get $1000(1+interest rate) in a year.
á
So the
$1000 today is worth at least $1000(1+r) in a year
¤ If you will have $1100 in a year you can
borrow against it.
á
If you
borrow ($1100/(1+r)) today
á
In a year
the debt will be ($1100/(1+r))x(1+r)=$1100
á
Which your
$1100 exactly pays off
¤ So $1000 today is equivalent to $1000
(1+r) in a year, where r is the interest rate
¯ This assumes
¤ That the future payment is actually
certain--future payments sometimes are not
¤ That you can borrow or lend at the same
interest rate--which you might not be able to do
¤ If you can't, the argument shows the boundaries. $1000 is worth at least as much as $1000x(1+r_lend) in a year, where r_lend is the rate at which you can lend
¤ At most as much as $1000x(1+r_borrow) in a year, where r_borrow is the rate at which you can borrow
¯ Generalizing the argument, the present
value of a stream of payments over time
¤ Meaning the fixed sum today equivalent to
the stream
¤ Is the sum of the payments, each
discounted back to the present
¤ Where a payment in one year is divided by
(1+r), in two years by (1+r)x(1+r), É
v One example: You have just won the
lottery--prize is 15 million dollars
¯ Actually, half a million a year for
thirty years
¯ They offer you five million today as an
alternative
¯ And the market interest rate is 10%.
Should you accept?
¯ Harder versions
¤ How low does the interest rate have to be
to make you reject their offer
¤ Your interest rate is 10%, the state can
borrow at 5%. How much should they offer you?
¯ A useful trick
¤ What is the present value of $1/year
forever
¤ If the interest rate is r?
¤ There is, or at least was, a security
that works this way--a British Consol
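A Python sketch of the lottery arithmetic at a 10% rate. One assumption is mine: the thirty $500,000 payments start a year from now; if the first payment comes immediately, multiply by (1 + r). The last line is the "useful trick": $1 a year forever is worth 1/r today.

    def present_value(payment, r, years):
        return sum(payment / (1 + r) ** t for t in range(1, years + 1))

    pv_30_years = present_value(500_000, 0.10, 30)
    print(f"{pv_30_years:,.0f}")          # about 4.71 million -- versus the $5,000,000 offered today
    print(f"{pv_30_years * 1.10:,.0f}")   # about 5.18 million if the first payment is immediate
    print(f"{500_000 / 0.10:,.0f}")       # 5,000,000 -- the perpetuity: $500,000/year forever at 10%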
v Another example: With risk
¯ The court has awarded you a million
dollar settlement, payable in five years.
¯ Of course, the firm might be out of
business in five years
¯ What is the lowest offer you ought to
accept, given that
¤ The prime rate is 5%
¤ You can borrow at 10%
¤ The firm can borrow at 15%
¯ First question: Why the difference?
¯ Second: Which rate should you use?
¯ First answer: the difference probably
reflects risk of default
¤ The market thinks that, each year, there
is about a 10% chance of default
¤ So a lender who lends $100 needs to be
promised $115 next year in order to get, on average, $105. (slightly simplified
because the two effects ought to compound, not add)
¯ Second answer:
¤ So you can use the market to estimate the
risk you won't be paid, assuming that the same conditions that lead to
defaulting on a debt lead to defaulting on a damage payment
¤ So you too should use 15% to discount the
payment in order to decide whether to accept an offer
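A Python sketch of the discounting, showing how much the choice of rate matters for the $1,000,000 payable in five years:

    def pv(amount, r, years):
        return amount / (1 + r) ** years

    for rate in (0.05, 0.10, 0.15):
        print(f"discounted at {rate:.0%}: {pv(1_000_000, rate, 5):,.0f}")
    # about 783,500 at 5%, 620,900 at 10%, 497,200 at 15%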
¯ Alternative approaches
¤ You could make your own risk estimate
¤ And might have to if the conditions that
lead to one default are different than those that lead to another
¤ You might also want to use a higher rate
if you are risk averse, since banks probably are not.
v Choosing the interest rate to discount at
¯ Easy case
¤ Insignificant risk--the two alternatives
are really both certain
¤ You can lend or borrow at the same
interest rate
¤ Use that interest rate
¯ Hard case one--still risk free
¤ You must pay a significantly higher
interest rate than you can get
¤ If you have enough capital so that you can pay for present
expenditures by reducing the amount you are lending out, then your lending rate
is the relevant one
¤ if you have to borrow, then the borrowing
rate is the relevant one if in fact you will borrow
¤ if accepting later income instead of
earlier income means not borrowing but spending less this year, more in the
future, then the right rate is between the two numbers.
¤ Why?
¯ Hard case two: Risk, but you are risk
neutral
¤ Some risk that future payments won't be
made
¤ Try to estimate that risk and discount
accordingly
¤ Which can sometimes be made by seeing
what interest rate the future payer has to pay to borrow money
¯ Hard case three: You are risk averse
The payer's borrowing rate is a lower
bound to what you should use
¤ Try to estimate the risk and decide how
risk averse you are
¤ Or your client is, if acting as an agent.
v Why isn't the riskless interest rate
zero?
v How does the interest rate depend on
risk?
¯ For a risk neutral lender
¤ Why a bank should be very nearly risk
neutral
¤ Why a stockholder should be very nearly
risk neutral
¤ Against what sort of risk should a
stockholder not be risk neutral?
¯ Are there any special kinds of risk for
which a bank or stockholder could be expected to be risk preferring?
v Internal rate of return
¯ The same calculation we have been doing,
from the other direction
¯ You are given the choice between a
million dollars today and $100,000/year for eight years
¤ You calculate the interest rate at which
the two alternatives are equivalent
¤ That is the rate of return they are
offering you on your million
¤ So if it is more than the interest rate
you can borrow or lend at, accept, if less, reject
¯ A firm is planning to build a million
dollar factory
¤ Which will make the firm $100,000/year
for eight years
¤ Then collapse into a pile of dust
¤ The internal rate of return is the
interest rate at which it is just worth doing
¤ Or in other words, the rate of return the
project gives the firm on its million
¤ Decide whether to build it according to
what the firm's cost of capital is.
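A rough Python sketch of finding an internal rate of return by bisection. The cash flows are made up (pay $1,000,000 now, get $200,000 a year for eight years), not the ones in the example above, just to give the solver a positive rate to find.

    def npv(rate, cashflows):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    def irr(cashflows, lo=0.0, hi=1.0, tol=1e-7):
        # assumes NPV is positive at lo and negative at hi, i.e. a single crossing in between
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if npv(mid, cashflows) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    flows = [-1_000_000] + [200_000] * 8
    print(f"{irr(flows):.1%}")   # roughly 11.8% -- worth doing if the cost of capital is below that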
v Predictable irrationality aka behavioral
economics aka evolutionary psychology
¯ Economists generally assume individually
rational behavior
¤ Meaning that individuals have objectives
and tend to take the actions that best achieve them
¤ This makes sense to the degree that the
rational actions are predictable
¤ The mistakes are not, so treat them as random error
¯ There is some evidence, however, for
certain patterns of "irrational" behavior
¤ Endowment effect
¤ Not discounting the future the way
economists think you should
á
Would you
rather have $100 today or $110 in a week? Many choose today
á
Would you
rather have $100 in a year or $110 in a year + a week?
á
Few choose
the $100
¯ Evolutionary psychology as an alternative
to economics
¤ Similar pattern—act as if making
the best choices for an objective
¤ But in evolutionary biology, we know the
objective—reproductive success
¤ And evolution is slow, so we are adapted
not to our present environment but to the environment we spent most of our
species history in
¤ I.e. as hunter/gatherers.
¯ The endowment effect and territorial
behavior
¤ Territorial animals have a territory they
treat as theirs
á
The farther
into it a trespasser of their species comes, the more desperately they fight
á
A fight to
the death is usually a losing game even for the winner, so É
á
On average,
the "owner" wins—the trespasser retreats
á
A
biological example of a commitment strategy in a bilateral monopoly game
¤ Think of the endowment effect as the
equivalent for non-territorial property
á
This is
mine, so I will fight harder for it than it is worth
á
Knowing that,
you won't try to take it away from me
á
We thus get
private property without courts and police
á
As long as
inequalities of power are not too great
¯ Uncertainty and trading off future
against present
¤ The environment we evolved in was risky
and short of mechanisms for enforcing long term contracts
¤ So we are designed to heavily discount
future benefits vs present benefits
¤ "A bird in the hand is worth two in
the bush"
¤ but not to heavily discount a year plus a
week over a year—both are future
¤ this also explains why we have to use
tricks to get ourselves to sacrifice present pleasure for future benefits
á
Christmas
club for savings
á
"I
won't have ice cream for dessert until I have lost five pounds"
á
think of it
as a rational economic you trying to control a much more short sighted evolved
you
á
and facing
the usual agency problems in doing so
v "If you're so smart, why aren't you
rich?" The economist's answer
¯ Ways of making money on the stock market
and why they don't work
¤ Suppose a stock has been going up
recently.
á
Buy
it—it will probably keep going up?
á
Sell
it—it will go back down to its long term value?
á
If either
method worked—it wouldn't.
¤ There are a variety of more elaborate
strategies which involve analyzing how a stock has done over time, or how the
market has done, and using that information to decide whether to buy or sell
¤ People who do this are called
"chartists."
¤ The idea is reflected in accounts of what
the market did
¤ If it goes up and then down, that is
called "profit taking"—with the implication that when it goes
up it will go down.
¤ People talk about "support
levels" and "barriers" and similar stuff.
¯ Suppose lots of investors are
superstitious, so sell stock on Thursday the 12th, expecting
something bad to happen on Friday the 13th
¤ So the stock (particular firm or the
whole market, as you prefer) drops on or just before Friday the 13th
¤ What should you do if you know this and
are not superstitious?
¤ What will the consequences be
¤ Generalize the argument to any
predictable pattern.
¤ And you have the efficient market hypothesis,
weak form
¤ The argument also works for lines in the
supermarket or lanes in the freeway
v The efficient market hypothesis
¯ Is the formal version of my Friday the 13th
story
¯ You cannot make money by using past
information about stock prices to predict future prices
¤ For instance, by buying a stock when it
is below its long run average, selling when above
á
Because
lots of other people have that information
á
The fact
that it is below or above means other investors have some reason to think it is
doing worse or better than in the past
¤ This is the weak form of the
hypothesis—limited to price information
¯ You cannot make money by using other
publicly available information either
¤ Such as the information sent out to
stockholders
¤ Or the fact that demand for heating oil
goes up in the winter
á
Radio ads
telling you to speculate in oil futures on that basis
á
But oil
futures already incorporate that information in their price
¤ Or the fact that this is an unusually
cold winter—other people know that too.
¤ This is the semi-strong form of the
hypothesis—all public information is incorporated in the stock price
¯ All information is incorporated in the
stock price
¤ Cannot include information that nobody
knows—a meteor is going to take out the main factory next week.
¤ What about information only one person
knows?
¤ A handful of people?
¤ Does it depend on who the handful are and
what the legal rules are?
v Why the hypothesis cannot be (perfectly)
true
¯ If even the weak form were perfectly
true, and individuals knew it
¤ There would be no incentive to look for
patterns in stock movements
¤ And if nobody is looking, the mechanism
that eliminates the patterns doesn't work
¯ Consider the analogous problem with
grocery store checkout lanes
¤ You have an armful of groceries, are at
one end of the store—should you search all lanes to find the shortest?
¤ No—because they will all be about
the same length, because É
¤ If one is shorter than the next, people
coming in between them will go to the shorter, evening them out.
¤ The efficient market hypothesis. But É
¯ Two limits to it
¤ If it were perfectly true, nobody would
bother to pay attention to line length,
á
so it
wouldn't work.
á
Especially
since length includes how much stuff each person has in his cart
á
Which takes
some trouble to look at and add up
á
So, if
people are perfectly rational, the differences in length have to be just enough
to provide enough reward to those who do check to make enough people check to
keep the differences down to that level.
á
Who
searches? Those for whom the cost of doing so is lowest
¬ Because they are good at mental
arithmetic and
¬ Don't have an armful of groceries
¬ So you should go to the nearest lane.
¤ Not all information is public
á
If you know
that one checkout clerk is very fast and other people don't
¬ You go to her lane even if the line is a
little longer
¬ And benefit from your inside knowledge
¬ Until enough people know to bring her
lane up to the same length in time as the others
¬ At which point only insiders are in her
lane
á
What if you
know one is very slow and other people don't
¯ These limits explain
¤ Hedge funds and the like
á
Very large
amounts of money
á
Very smart
people working for them
á
In the
business of finding very small deviations from efficiency and eliminating them
á
At a
profit.
á
"Statistical
arbitrage"
¤ Explaining Warren Buffett
á
He claims
to be proof that the efficient market hypothesis is false
á
Because he
has done enough better than the market so that, by chance, not one such
investor ought to exist.
á
But then,
his ability to evaluate information might be extraordinarily good
á
Which
points out some of the ambiguity in the idea of publicly available information.
¯ At the individual level, the argument for
throwing darts at the Wall Street Journal doesn't work if either
¤ You have information nobody else has
á
The
checkout clerk in lane 3 is very slow
á
There is
construction coming up in the left hand lane of the freeway
á
The CEO of
the firm is an old college acquaintance, and you know he is a clever and
plausible crook
¤ You have an opinion you are willing to
bet on and know many other people will bet the other way
á
When the
first Macintosh came out, I told a colleague I was getting one
á
He asked
why I didn't get a PC Jr.
á
So I bought
stock in Apple
á
I have made
four investments on that basis.
¬ Three made me money
¬ One lost it
¯ But at the time I thought it was more
likely to lose money than make it
¯ But had a positive expected return.
¤ Which suggests two ways of making money
in the stock market
á
Knowing
enough about the firm to tell if it is over or underpriced—accounting+ or
É
á
Depending
on your special information
¬ And not bothering to know everything else
relevant to the firm
¬ Because the market will already have
incorporated all that into the stock price.
¤ The third way to profit isn't by making
money
á
If my wife
is an oil geologist, I should stay out of oil stocks
á
Or even sell
them short
v Exercise, which I will put on the
syllabus for you to think about
á What is economics?
á A way of understanding behavior
á Based on a simple assumption
á Rationality:
á Meaning that people have purposes and tend to take those actions …
á Not a statement about how people think
á But about the consequences
á True of cats and babies
á Not entirely true, but É
á A lot of human behavior fits that pattern, and …
á We don't have a good theory for the rest, so É
á Treat it as random error.
á In some contexts, truer than it ought to be
á Firms maximizing profits
á Large markets where random effects cancel out
á What does it apply to?
á All behavior in all times and places
á My size of nations
á Economic Analysis of Law:
á Armed Robbery
á Contracts under duress
á Mugger--argument for enforcing
á Parole system in warfare--argument against
á Pinochet--argument in both directions.
á Politics, marriage, war, É .
á Rational ignorance. Name of congressman?
á Armies running away. Njalsaga.
á Silent student problem
á Divorce rate?
á We find out by trying
á Conventional area of applications
á Explicit markets, prices, inflation, unemployment, etc.
á Ideas best worked out in those areas, so we will spend most of our time there, but É
á With detours to apply the ideas elsewhere.
v The
coordination problem
¯ Why
our society cannot exist and we must all be dead
¤ In
order to achieve almost anything—produce food, build houses, make clothes
á
We require the coordinated cooperation of millions of
people
á
The house requires, among many other things, wood
¬ Which
requires people growing trees and cutting them down
¬ Which
requires people making chain saws
¬ Which
require people making iron and steel and gasoline
¬ Which
require É
¤ There are, broadly speaking, two solutions to the coordination problem
á
Central direction—someone tells everyone what to
do
¬ Which works on a small scale
¬ But
becomes hopelessly unworkable at the scale of a national economy
á
Decentralized
coordination via prices, voluntary exchange, etc.
¯ How
should we judge alternative solutions to the coordination problem
¤ As
embodied in legal rules
¤ Government
policies
¤ É
¯ One way is by their net effect on everyone, which we can think of as the "size of the pie."
v Perfect
competition
¯ A
simplified model of how the exchange market works
¤ Infinite
number of participants, so each participant ignores the effect of how much he
buys, sells, produces, consumes on the market price.
¤ Complete
information
á
So if you pay $2 for something
á
It must be worth at least $2 to you—or you wouldn't have paid it.
¤ All
transactions are voluntary
á
No theft, or É
á
Torts, or É
á
Involuntary interactions not covered by law or
contract, such as my playing my car stereo so loudly that it bothers other
drivers.
¤ In
explaining perfect competition one often makes additional simplifying
assumptions, then drops them later.
á
You can find a much more extensive version in my Price
Theory text, webbed on my site
á
Or my _Hidden Order_, which I will put a copy of on
reserve.
á
In both I work through the simplified version, then put
the complications back in.
¯ A
pretty good approximation for some but not all market settings
¤ One
big advantage over more complicated models is that we can solve it
¤ Prove
theorems about the outcome, in particular.
¤ One
can prove that it maximizes the size of the pie in a useful although not
perfect sense
¯ Which
means that one can look for ways of increasing the size
¤ By
seeing where the assumptions break down
¤ And
how those breakdowns reduce net benefit to people.
v Demand
and supply curves
¯ A
demand curve shows quantity demanded as a function of price
¤ In
casual conversation, I "want" X amount of something
¤ But in fact, how much I want depends on how much it costs
¤ Because
what I am really choosing is not "an ice cream cone" but
á
+ the pleasure from consuming an ice cream cone
á
-the value to me of the money I have to pay for it
á
-the cost to me of the calories I get from it
¤ and
whether that nets to plus or minus depends, among other things, on the price.
¯ As
price changes, I move along my demand curve—choose to consume more if the
price goes down, less if it goes up
¤ Adding
up individual demand curves, we move along the market demand curve
¤ Which
is the horizontal sum of individual demand curves
¤ Since
the amount we buy is the sum of what I buy and you buy and É
¯ A
shift in the demand curve changes the relation between price and quantity
¤ Something
happens which makes me willing to buy more (shift right) at any price
¤ Or
less (shift left)
¯ Economists
distinguish between
¤ a
change in demand (demand curve changes)
¤ and
a change in quantity demanded (quantity changes, whether because the curve
moved or because the price changed with the curve staying the same)
¤ and
that distinction avoids a lot of verbal confusion.
v How
much is it worth to me to be able to buy all the apples I want at $1/apple?
¯ Suppose
I am willing to pay $3 to have one apple instead of zero.
¤ I
am paying $1 to get something I value at $3,
¤ so
gaining $2
¤ my "consumer surplus" on the
first apple
¯ Suppose I am willing to pay $2.50 to have two apples instead of one
¤ My surplus on the second apple is $1.50
¤ So my total surplus on the two apples is $3.50
¯ But "willing pay $3 to have one
apple instead of zero"
¤ Means that at a price of $3 I would buy
one apple
¤ So my demand curve shows a quantity of 1
at a price of $3
¯ Following out the argument, my consumer
surplus on buying all the apples I wish to buy at $1/apple is the area
¤ Under my demand curve and
¤ Above a price line at $1/apple
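¯ A minimal Python sketch of that arithmetic (the $3.00 and $2.50 figures are the ones above; the later willingness-to-pay numbers are made up for illustration):
    # Buy every apple worth at least the price; surplus is value minus price on each.
    willingness_to_pay = [3.00, 2.50, 2.00, 1.50, 1.20, 0.90]  # 1st, 2nd, ... apple
    price = 1.00
    bought = [v for v in willingness_to_pay if v >= price]
    surplus = sum(v - price for v in bought)
    print(f"I buy {len(bought)} apples; consumer surplus = ${surplus:.2f}")
¯ Adding up those gaps between value and price is just the "area under the demand curve and above the price line" in discrete form.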
v The same argument applies to a supply
curve
¯ It shows how much a producer will produce
and sell
¯ If I would produce a unit for any price
above $1
¯ And can sell it for $3
¯ My producer surplus, aka
"profit," is $2.
¯ Generalizing that argument, producer
surplus is
¤ The area above the supply curve
¤ And below the price.
v So total surplus from a particular market
is the area between supply and demand curves
v And the net effect on individual welfare
of a change is the change in that area.
¯ But note that this assumes the ordinary
market setting
¯ And we are about to see some problems
with that assumption
v Suppose we have price control on gasoline
¯ The market price would be $2/gallon, at
which quantity supplied equals quantity demanded
¯ Instead the law fixes it at $1/gallon
¯ The book's version of what happens
¯ What is wrong with this story?
¤ At the price, consumers want to buy more
than producers want to sell
¤ What decides who gets the gasoline?
¯ Simple answer: whoever gets to the gas station before they run out.
¯ So lines start to form at gas stations
¯ How long does the line have to be for
quantity demanded—at a price that includes the time waiting in
line—to equal quantity supplied?
¯ What is the overall effect on surplus.
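¯ A Python sketch of that story; the $2 market price and $1 ceiling are from the example above, but the linear demand and supply curves (and so the exact numbers printed) are assumptions made up for illustration:
    # Demand and supply chosen to clear at P = $2, Q = 40 (hypothetical units)
    def qd(p): return 100 - 30 * p      # quantity demanded at money price p
    def qs(p): return 10 + 15 * p       # quantity supplied at money price p

    ceiling = 1.0
    supplied = qs(ceiling)              # what sellers offer at the controlled price
    shortage = qd(ceiling) - supplied

    # The line grows until waiting is costly enough that quantity demanded at
    # (money price + waiting cost) falls to what is actually supplied.
    waiting_cost = (100 - supplied) / 30 - ceiling
    print(f"shortage at the ceiling: {shortage:.0f} units")
    print(f"waiting must cost buyers about ${waiting_cost:.2f} per unit")
¯ The waiting cost, unlike a higher money price, is not received by anyone, which is one way of seeing the loss of surplus.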
v Government regulation over professions
¯ The arguments that are given in the book
all assume a philosopher king government
¤ The government is trying its best to do
good, but doesn't always succeed
¤ Is there a more plausible model?
¯ Facts on medical licensing history
¤ During the five years after Hitler came
to power, about the same number of foreign physicians were admitted to practice
as during the five years before
¤ During the Great Depression, the AMA
informed medical schools that they were graduating too many students
¤ Every school cut back
¤ Medical licensing normally requires graduation from an approved school
¤ Where do you think the states got their
list of approved schools?
¯ Licensing vs certifying
¤ If the problem is that consumers don't
have the information, certifying is sufficient
¤ The argument for licensing is that, even
with information, the consumers will make the wrong choice.
¤ Or that a large part of the cost from using a low quality professional is borne by someone other than the person making the decision
á
I have a
building designed by an incompetent architect
¬ It falls down, injuring other people
¬ Whom I cannot compensate for their loss
¬ But—as in moral hazard in general,
although I don't have the full incentive, I have a substantial incentive, since
if the building falls down I lose a lot of money
¬ And my mortgage company has an incentive
too
á
I use an
incompetent physician
¬ Don't get cured
¬ Spread my (contagious) disease
¬ But again É
¤ The argument against is that licensing
can be used to control consumers in someone else's interest, usually the
profession
á
Not only
physicians and lawyers are licensed, but also
á
Yacht
salesmen and egg graders and barbers and É
v Monopoly
¯ What?
¤ A single firm that produces almost the
total output for a market
¤ And so can control price--at the cost of
selling less the higher the price.
¤ Of course, a firm in a competitive market
can ask any price it wants too--but above the market price, its sales drop to
zero.
¯ Why?
¤ Only owner of a required input
á
A choke
point--the only pass through the mountains
á
De Beers?
¬ Demand story--that DeBeers created the
demand for diamonds for engagement rings. But É
¯ "In fact,
in 1938 some three quarters of all the cartel's diamonds were sold for
engagement rings in the United States." (Before the publicity campaign
started)
¯ unclear how much it is a
story of an ad campaign that sold diamonds, how much of one that sold itself.
¯ Competing explanation by
Margaret Brinig
¬ Do they control production?
¯ 1957, Soviet production 20-30% of world
¯ Australia, Angola, Canada, Zaire
(<3%),
¯ DeBeers mines "represented about
half of total supply"
¬ A monopoly on marketing, not production
¬ Cartel? Natural monopoly? Unclear.
á
Me. Or
Apple. Or your corner grocery store.
¬ More generally, the sole producer of a
particular variety of good
¬ Has some monopoly power wrt buyers who
want that variety
¬ And that sort of monopoly is much more
common, and probably more important, than the "giant firm controlling the
X industry" type.
¯ Natural monopoly
¤ Economies of scale
á
Sometimes
increasing quantity reduces cost per unit, because fixed costs such as design
or tooling can be spread over more units
á
Sometimes
it increases cost, because the larger firm has more layers between the
president and the factory floor
á
So average
cost typically first falls with output, then eventually comes up again.
¤ If it keeps falling out to the full
extent of the market, you have a natural monopoly
á
Since it
can make money selling at a price at which a smaller competitor would lose
money
á
It ends up
with the whole market
¤ The common case of this is the
specialized producer mentioned above
á
Where the
cause is not the large economies of scale, but É
á
The small
size of the market
á
A small
town general store, me as a speaker, your favorite author
¤ But it could also exist on a large scale
if there are very large economies of scale.
¯ Artificial monopoly: Predatory Pricing
and The Standard Oil Myth
¤ "Big firm has deep pockets, sells
below cost to drive out smaller firms"
á
both firms
are losing money, but the big firm has more money to lose, so lasts longer
á
what is
wrong with this story?
¤ McGee article in JLE
á
He went
through the many volume transcript of the Standard Oil antitrust hearing
á
And found
no examples of predatory pricing, actual or claimed
á
The closest
was a threat to Cornplanter oil to cut prices on them if they didn't stop
expanding at Standard's expense
á
The manager
of Cornplanter, by his testimony, told the Standard Oil man that if they cut
prices on him he would cut prices over a much larger area, costing them a lot
more money.
á
And that
was the end of the matter.
¤ Hard to prove that predatory pricing is
impossible
á
Given the
nature of game theory and commitment strategies
á
But the big
firm seems to have the weaker hand in the game
á
Cheaper to
try to buy out competitors—which Rockefeller did. But that has a long run
problem
v State enforced monopoly
¯ The original meaning of the
term--monopoly privilege sold to raise money or given to favored subjects.
¯ Still a common form of monopoly
¤ It is illegal to compete with the Post
Office in the delivery of first class mail
¤ It is illegal to sell liquor in states
with a state liquor monopoly
¤ Until deregulation, the airlines were a
cartel enforced by the CAB
á
It was
illegal to change fares in either direction without permission
á
Or to start
flying a new route without permission
á
PSA story
¤ Professional licensing is a form of state
monopoly
¤ As are patents and copyrights
v What is wrong with monopoly?
¯ Consider a simple case
¤ Fixed cost of a million dollars a year to
operate a factory
¤ Marginal cost of a dollar a widget to produce widgets
¤ If you produce a million widgets/year,
average cost = $2
¤ If you produce two million, $1.50
¤ And so on out to infinity É
¯ What price maximizes the firm's profits?
¤ The lower the price, the more units they
can sell
¤ Suppose they sold widgets for a dollar/widget--their
marginal cost
¤ They lose a million dollars a year
¤ Raising the price and cutting quantity
has to be a win
¤ When do you stop raising the price?
¤ consider marginal revenue--increase in
revenue by selling one more unit
á
If marginal
revenue <marginal cost, you are losing money on the last unit sold
á
So should
cut back until you reach the quantity where MR=MC
á
Any further
cut costs you, since you are cutting a unit that brings in more than it costs.
á
Selling one
more unit/year gets you the price it sells for
á
Costs you the reduction in the price at which all other units can be sold, since it takes a slightly lower price to increase sales
á
So marginal
revenue is less than the price
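¯ A numerical sketch of that choice in Python; the $1,000,000 fixed cost and $1 marginal cost are from the example, while the linear demand curve is an assumption made up so there is something to maximize:
    FIXED_COST = 1_000_000
    MC = 1.0

    def quantity(price):
        # hypothetical demand: 3,000,000 widgets at $1, falling to zero at $4
        return max(0.0, 1_000_000 * (4 - price))

    def profit(price):
        return (price - MC) * quantity(price) - FIXED_COST

    prices = [1 + i / 100 for i in range(301)]          # $1.00 ... $4.00
    best = max(prices, key=profit)
    print(f"best price ${best:.2f}, quantity {quantity(best):,.0f}, profit ${profit(best):,.0f}")
    print(f"at price = MC = $1: profit ${profit(1.0):,.0f}")   # loses the fixed cost
¯ At the best price the last widget's marginal revenue just covers the $1 marginal cost; selling at marginal cost itself loses the million dollars a year, as in the example.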
¯ If we consider the combined effect on
seller and buyer, what price "should" the seller sell at?
¤ As long as price is above marginal cost
¤ There are consumers who value the good at
more than it would cost to produce one more unit for them
¤ But are not getting it
¤ Producing that extra unit and giving it
to the consumer would benefit the consumer by more than it cost the producer
¤ So the efficient rule is to sell at a
price equal to marginal cost
¯ So a monopoly maximizes its profit at a
higher price than the price that maximizes net benefit to customer plus firm.
¤ This is the standard economic argument
for why a monopoly is inefficient
¤ And evidence of the risk of assuming
technical terms have their common meaning
¤ Since the hearer will imagine the
statement is about how badly run a monopoly firm is
¤ When it is actually about how a well run
monopoly firm will act
¯ A second inefficiency: Rent seeking
¤ My railroad story
¤ The exchange control version
¤ The tariff version
¤ The price control version
¤ What's wrong with theft
¤ Gordon Tullock, "The Welfare Costs of Tariffs, Monopolies and Theft"
v Oligopoly and cartel
¯ An oligopoly exists when
economies of scale are large enough so there are only a few firms in the
industry
¯ They might compete, trying to
take account of the effect each has on prices
¤ We could model this as a Nash
equilibrium
¤ Where each takes the behavior
of the others as given
¤ And decides what price and
quantity maximizes its profits
¯ Or they might coordinate, act
as a cartel, all agreeing to hold down output in order to drive up prices
¤ In the U.S. at present, this
is illegal
¤ Even if legal, each member of
the cartel has an incentive to chisel—cut prices a bit under the table in
order to lure customers from the others
¤ Unless the agreement is not
only legal but enforceable
v Monopolistic competition
¯ Lots of firms, differing in
location or characteristics of the product
¯ Each one has a partial
monopoly wrt customers "close" to it
¯ Competes for customers who
are near two or more firms
¯ And if the outcome is above
market profits, additional firms can enter the market
¯ Driving profits down.
v What to do about natural monopoly?
Three alternatives, all bad.
¯ Government monopoly--the post
office
¤ What is its incentive to
charge marginal cost?
¤ Or to hold costs down
¤ When some "costs"
can be used to buy political support
¤ For instance from employee
unions
¤ Or other sellers of inputs
¯ Regulated monopoly: Public
utilities and the like
¤ If forced to charge marginal
cost they go broke, so require a subsidy
¤ In practice the usual attempt to force
average cost--still inefficient, but not as inefficient as what the monopolist
would do
¤ But how do you know what average cost is?
á
If you ask
the accountants, the firm has no incentive to hold down cost
á
Since
whatever it is, that's what they are allowed to charge
¤ And why is it in the interest of the
regulator to try to maximize social benefit anyway
á
When he
could be currying favor with the regulatee in exchange for a well paying job
when he leaves government service
á
Or buying
votes for his political patrons at the cost of either the regulatee's
stockholders or its customers
¯ Private monopoly
¤ Has the inefficiencies already discussed,
but two advantages
¤ It does have an incentive to minimize
costs, and É
¤ It doesn't have the ability to use the
government to prevent competition
¤ Which matters if change over time makes
it no longer a natural monopoly
¤ Consider, in contrast, the history of the
ICC
á
Set up to
regulate the (somewhat) monopolistic railroads
á
Regulated
the competitive barge industry, and later trucking industry
á
In part to
protect the railroads from their competition.
v Other forms of monopoly
¯ The standard argument shows
that competition is superior to monopoly, where competition is
possible—i.e. not a natural monopoly
¯ So is an argument against
government enforced monopoly
¯ Or monopoly created by one firm buying out its competitors.
¤ But the antitrust division
has to distinguish that case
¤ From one where firms merge to
make a better firm
¤ So ask the other firms in the
industry
¤ And do the opposite of what
they tell you to.
v Price discrimination
¯ The problem with inefficiency
from the monopolist's point of view
¤ All those customers out there
who would buy my additional units at more than it costs me to produce them, but
not at my price
¤ How can I make money off them
too?
¤ By finding some way of
charging different prices to
á
Different
customers, or É
á
The
same customer for different units.
¯ Consider a cookie company
with identical customers, MC = .10/cookie
¤ $1/cookie up to 10/week, then
.50/cookie
¤ can do better with a much
more complicated pricing system
¤ better still with the cookie
club
¤ but É how do we prevent
resale?
¤ Cookies can only be consumed
on the premises???
¤ Works better for electric
power, or transportation, or medical services
¯ Consider an author and his
publisher, with two kinds of customers
¤ Fans will pay $20 for the
book
¤ New readers will pay $10
¤ Solution?
¤ What if the two kinds are
rich Americans and poor Englishmen?
¤ Other examples
¯ Is price discrimination a
good or bad thing?
¤ Perfect price discrimination
eliminates the classical inefficiency, but É
á
Increases
monopoly profit and so
á
Expenditures
on getting it
¤ Imperfect price discrimination may reduce
the classical inefficiency
á
Also may
make possible the production of goods that could not cover their cost if all
sold at the same price
á
But if you
divide your market (U.S./England) a book might go to an Englishman who values
it at $11 instead of an American who values it at $19, which is inefficient
á
And there
are costs to doing the price discrimination
¬ Which you pay because you get a benefit
¬ But it may come at the expense of the
customer who you are getting to pay a higher price, in which case on net it
makes the two of you worse off
¬ Or it may permit a customer to get it who
otherwise wouldn't, in which case it makes the two of you better off
¤ So the net effect is indeterminate
á
Price
discrimination might on net make us better off
á
Or worse
v Externalities
¯ The simple solution to the coordination
problem
¤ Everyone bears the costs of his own
actions
¤ Pays enough for inputs (labor, raw
materials) so that the sellers don't lose by selling them to
him—otherwise they wouldn't.
¤ Charges a price for his output at which
the buyers don't lose by buying—otherwise they wouldn't
¤ If his income minus his costs is positive, then
á
He
produces, and É
á
In terms of
net effects, should produce
¤ So we have a whole lot of tiny decisions,
coordinated through the price system
¤ For more details, read a price theory textbook, preferably mine.
¯ Why it doesn't work with externalities
¤ My action imposes a cost on you which
doesn't require your permission
¤ So I ignore that cost, take the action
even if my gain is less than your loss (negative externality)
¤ Or my action produces a benefit I can't charge you for, and I might fail to take it even if your benefit was larger than my loss. (positive externality)
¤ Note that the problem isn't the existence
of the external cost
á
Internal
costs are costs too
á
The problem
is that because the cost is external, it leads individuals acting rationally to
make the wrong decision
á
And
individual rational action is what we mostly rely on to get the right decision
made.
¯ The Regulatory solution
¤ Have a government agency decide what I
ought to do, taking into account all costs external and internal
¤ And order me to do it
¤ Filter my factory's smokestack, for
example
¤ This has two problems
á
It requires
information about the costs of alternatives that the regulator probably doesn't
have and the firm has no incentive to tell the truth about
á
The regulator may find better things to do with his power than improve the world
¯ The Pigouvian solution
¤ Tax me the amount of my externality
¤ Now I will take account of it in my
decisions
¤ Filter the smokestack if and only if
doing that is the cheapest solution
¤ And costs less than the resulting
reduction in pollution is worth
¤ Obvious problems
á
Whoever
sets the tax needs to measure the damage done by the pollution—so it
requires less information than direct regulation, but still some
á
Again, the
incentive of those setting the tax
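¯ A toy Python sketch of the firm's choice once the tax is in place; every dollar figure is hypothetical:
    damage_from_smoke = 80_000      # yearly harm to the neighbors = the Pigouvian tax
    options = {
        "keep polluting and pay the tax": damage_from_smoke,
        "filter the smokestack": 50_000,
        "switch to a cleaner fuel": 65_000,
    }
    choice = min(options, key=options.get)
    print(f"the firm picks: {choice} (cost ${options[choice]:,})")
    # It abates only if some abatement option is cheaper than the damage,
    # which is the decision we wanted it to make.
¯ Change the filter cost to $90,000 and the firm pays the tax instead, which is also the right answer if abatement really is that expensive.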
¯ Coase's criticism
¤ Nothing works
á
An
externality isn't a cost I produce for you, but É
á
A cost of
things both of us are doing that are incompatible
á
The physician
and the candy factory.
á
Airport
noise is only a problem because there are people living under the flight path
á
The lowest
cost solution might be to reduce the noise, or to have something other than
housing under the flight path
á
But if the
airlines must pay a tax for the noise, they may reduce it at a high cost, when
the alternative solution is less expensive—and would occur if the
airlines were not being taxed.
á
More
generally, the Pigouvian solution requires you to know which party is the lower
cost avoider of the problem—or whether the best solution is for both
parties to do some avoidance
¤ Everything works
á
If
transaction costs are low, the physician wins the case, but moving his
consulting room costs less than moving the machinery, É time to start bargaining
á
If the
airlines are not taxed for the noise, but noise reduction is the less expensive
solution
á
The
airlines will contract with the owners of the land under the flight path to
reduce the noise—at the owners' expense
á
Or buy the
initially cheap and empty land, then reduce the noise, then resell.
á
More
generally, if the outcome is inefficient, some change could produce benefits
for all concerned, so if transaction costs are low they will bargain to that
change
á
The right to do X moves to whoever values it most highly
¤ It all depends
á
On
transaction costs
á
Organizing
a contract by which everyone who lives in Los Angeles pays everyone who drives
there to reduce his pollution would be hard
á
So where
transaction costs are high, we are back with the Pigouvian problem.
v Externalities and the law
¯ EPA style regulation is the first
approach—give orders about outcomes
¤ Require all power plants to remove X% of
the sulfur from what goes up the smokestack
¤ Giving no incentive to use low sulfur
coal instead.
¤ There are states that produce high sulfur
coal. They have senators.
¯ The tort system can be viewed as a
version of Pigouvian taxes
¤ I impose an externality, you sue me
¤ I have to "make the damage
whole," i.e. pay the amount of the damage
¯ Control of air pollution via tradable
emission permits can be seen as
¤ Pigouvian approach, with the government
deciding the amount of pollution, the market deciding the price
¤ The Coasian approach, with the right to pollute moving to those who value it most highly
¤ But without any possibility of bargaining
between polluters and breathers.
v Public goods: Does not mean a good
produced by government. Government produces some private goods, many public
goods are privately produced.
¯ Two characteristics usually used to
define a public good
¤ Non-excludability. If the good is
produced, it will be available to all the members of a preexisting group of
people
á
Consider an
(unencrypted) radio broadcast
á
Or my
repainting my shabby looking house
¤ Non-rival. One person's use of the good
doesn't interfere with another person's
á
The cost of
the radio broadcast does not depend on how many people choose to listen to it
á
My neighbor
enjoying the improved appearance of the neighborhood doesn't keep passing
drivers from admiring my handsome house
¤ In my view, it is the first characteristic
that is responsible for the essential characteristics of a public good
á
A good that
is only non-rivalrous is simply a natural monopoly with marginal cost of zero
á
A good that
is non-excludable raises the problems we will discuss, even if my use does to
some degree interfere with yours
¯ The public good problem: How to pay for
producing it
¤ I propose to build a flood control dam, cost $1 million
¤ The protection is worth $10,000 each to a
thousand farmers downstream
¤ So the dam is worth building, but É
¤ How do I get them to pay for it?
á
Each farmer
knows that his payment is unlikely to make the difference
á
And if the
dam is built, he benefits even if he didn't contribute
v Possible solutions
¯ private
¤ Consider radio and TV—an obviously
insoluble public good problem that is routinely solved, since both are produced
privately
á
The
ingenious solution is to produce two public goods
¬ One with a positive production cost and
positive value to consumer
¬ One negative and negative
¬ And tie them tightly together
á
The negative
one pays the broadcaster's costs, the positive one gets the listeners
¤ Consider my dam story with ten farmers,
and a $50,000 dam
á
I draw up a
contract by which each farmer agrees to pay $5,000
á
If and only
if every other farmer does
¤ Consider the story with 100 farmers, of
which two are large farmers who gain by $40,000 each.
á
They might
agree to pay for the dam, letting the rest free ride—it's still worth it
to them.
á
Which is
how my house gets painted—it's worth it to me, even if my neighbors get
some of the benefit
á
The big
farmers, or homeowner, are a "privileged minority."
¤ Technological solutions
á
Sometimes,
whether a good is public depends on how you produce it
á
Radio or TV
broadcasts could be encrypted, for instance
á
In a world
without copyright, you could keep books in libraries chained to the desk,
charge for access
¯ Legal
¤ To some degree, whether something is a
public good depends on the law
á
Without
copyright law, the author produces a public good
á
One could
make it illegal to listen to a radio frequency without having paid the
broadcaster
¬ Such a law would be costly to enforce,
but É
¬ England actually has a version of it (I
think one payment for all TV listening)
á
In open
range, grazing land is a commons, so a public good
¬ Converting it to private property makes
it a private good
¬ But again, there may be costs
¯ Government production
¤ How we produce national defense
á
Book's
claim that it wouldn't be privately supplied is an exaggeration
á
The Comanche, with no government at all, blocked expansion across Texas for a couple of decades
á
But it
clearly is a hard public good problem
¤ Problem with public production: Usual two
á
It's hard
to do it right
¬ How do you find out what the optimal
quantity and quality of the public good are? No market feedback. Ask the
customers—they want lots of it, at someone else's expense.
¬ How do you find out if it is worth
producing at all? Cost might be higher than benefit
á
It may not
be in the interest of those making the decisions to do it right
v Public goods as a problem in the
political market
¯ Rational ignorance
¤ The simple model of democracy requires
each voter to know what his representatives are doing, if they should do it,
vote them out if not.
¤ That requires a lot of expensive
information
¤ And acquiring that information means
producing a public good for everyone in the country—a very large public
¤ So most of us don't
¯ The market for legislation
á
Legislation
that benefits all members of an interest group is a public good for the
members.
á
A
concentrated interest group can do a better job of solving its internal public
good problem than a dispersed interest group
á
So
legislation tends to benefit concentrated groups at the expense of dispersed
groups.
v Again Transaction costs
¯ Some are the public good problem
¤ If homeowners pay the airline to reduce
noise,
á
all
homeowners benefit
á
Including
those who didn't contribute
á
So why
should I chip in?
¤ For a small public, such problems might
be soluble via conditional contracts and the like
¤ So this problem becomes larger the larger
the numbers involved
¯ Some are holdout problems
¤ Suppose the homeowners have the right,
and the airline can be enjoined from making noise by any one of them
¤ Further suppose reducing noise costs $10
million, soundproofing 100 houses costs $1000 each
¤ What happens?
¯ Even with small numbers of people, there
can still be problems associated with bilateral monopoly bargaining
¤ Actual mistakes about how much it is
worth to the other party
¤ Or bargaining breakdown because both
parties are trying for most of the gain
¯ In assigning rights via the legal system,
there are at least two important questions
¤ Who has the right—airline to make noise, homeowners not to be bothered by noise
¤ Is the right injunctive or for damages?
¤ Think about how we might decide
¤ All of which is getting into law and econ
stuff—more of which is coming up.
v Welfare economics
¯ The problem: Questions economists get asked--"Should we do X?"
¤ "Is strict liability better than
negligence?"
¤ "Are tariffs good for America?"
¤ "Should we restrict
immigration?"
¯ we respond with "economic
efficiency"
¤ a criterion of goodness that has two
characteristics
á
it
corresponds to a significant degree to what people are asking
á
and, if we
rephrase questions in terms of it, we can often answer them
¤ but the correspondence is very imperfect
¯ defining efficiency
¤ This is my version, based on Alfred
Marshall
á
It isn't
how most textbooks put the argument
á
But it is
what most economists actually do
¤ Measure all costs and benefits by
willingness to pay, sum them
á
If I would
pay up to a dollar to get an apple, the value of my getting it is $1
á
If you
would pay up to fifty cents, É
á
So
transferring the apple from you to me increases total benefit by $.50
¤ Anything that increases the total is an
economic improvement, anything that decreases it is a worsening
¯ What is wrong with it:
¤ It accepts individual preferences as
defining value—heroin is of value to a heroin addict, as shown by the
price he will pay
¤ It does interpersonal comparisons as if a
dollar represented the same happiness to everyone—rich and poor,
materialist and ascetic
¤ It assumes that all that matters is something like human happiness
á
What if
cutting down the oldest tree in the world benefited humans
á
Or
eliminating a species
á
Or
suppressing a great novel that made its readers sad
¯ On the other hand
¤ A lot of economic issues have a large and
random group of beneficiaries, similarly of losers, so individual differences
may average out
¤ If you believe that taking a dollar from
Gates to give to a beggar is a good thing, perhaps an efficient legal system to
maximize the pie combined with a straightforward tax and transfer system to
redistribute it makes sense
¤ Dollar values are expressed in individual
actions, so we actually have a way of maximizing them
¤ On a free market, the apple ends up with whoever values it most
¤ No mindreading required.
¯ Do read the book's discussion, since it
is rather different from mine.
Law and Economics
v What is it
¯ You are in a society where life
imprisonment is the highest punishment
¤ Someone proposes life for armed robbery
¤ Is it cruel and unusual? Is it just?
¤ The economist asks
á
Do you
really want armed robbers to kill their victims
á
Because if
the punishment for armed robbery and armed robbery+murder are the same
á
The
additional punishment for murder is zero
á
And a dead
man can't identify you in a police lineup
¤ The economic approach assumes individuals
are rational, deduces the effect of legal rules, and lets you decide if you
like that effect or not.
¯ You take a particularly good opportunity
to push your rich uncle off a cliff
¤ Unfortunately for you, a bird watcher has
his camera pointed your way
¤ At the trial, you confess to the murder,
but É
¤ Your attorney points out that
á
It was an
unlikely opportunity
á
You only
had one rich uncle
á
If you had
another, he is unlikely now to go mountain climbing with you
á
So no need
to punish you--it won't happen again.
¤ What is wrong with this argument?
¤ The economic approach is forward looking, not backward looking
á
Not
primarily concerned with how to clean up this mess as an end in itself
á
But how the
way we resolve this situation will affect acts that might lead to similar
situations in the future
v Property
¯ Why does the institution exist?
¤ To the non-economist, the obvious
question is why anything should be property
á
Why not
just let anyone use things when he needs them?
á
None of
this "mine and thine" nonsense
¤ To the economist, why anything should not
be.
á
Property
solves two problems
¬ Getting the right things made
¯ If you make it, it is yours
¯ So if it is worth more than it costs to
make, you gain by making it
¯ Which is the right rule for what ought to
be made
¬ Getting things taken care of
¬ Allocating things to the right person
á
So why not
use it for everything?
¬ How about the English language?
¬ The first inventor of a word owns it
¬ And anyone who wants to speak must get
licenses for all the words
¤ The approximate optimality of aboriginal
property rights
¤ There are advantages to treating things
as property, as above, but also disadvantages
á
There are
costs to defining, protecting, transacting over property
á
If the
advantages are small and the costs substantial, we may be better off treating
something as a commons
á
Putting up
with some inefficient production and allocation
á
In exchange
for saving those costs
á
For
instance the English language
¤ Why we owe civilization to the dogs--a
possibly true fable
¯ I.P. as an example
¤ Why are patent and copyright so
different? The Constitution doesn't propose different rules
á
Copyright
is given easily, for a long period of time
á
Patent
grudgingly, for a short period of time
á
Why?
¤ Copyright protection (old style--not
modern look and feel etc) protects things with clearly defined borders, where
accidental trespass is unlikely and independent creation still more unlikely
á
Long terms
don't over reward, because even after fifty years nobody else would have
written your book
á
Enforcement
isn't costly (but that is now changing with technology)
¤ Patent protection protects ideas--fuzzy
boundaries, accidental trespass likely, independent invention likely
á
Various
requirements try to limit protection
¬ to inventions not likely to be made
tomorrow by someone else
¬ Such as novelty and nonobviousness
á
Costly to
enforce--litigation over fuzzy boundaries, costs of checking everything you
invent to make sure nobody else has patented it
á
Term should
be and is short, because after a while someone else would probably have invented
it
¤ So I.P. fits our sketch of why some
things are more suitable for property protection than others
v Property II: What's in the bundle
¯ Ownership of land gives you some rights
wrt the land, but not all
¤ Can prohibit trespass, uses, but not
overflights,
¤ May or may not control mining under
¤ Or pumping oil somewhere else that makes
the oil under your land flow away to someone else's reservoir
¯ How might we decide what to include?
¯ Coasian analysis
¤ Minimize the sum of inefficiencies from
the wrong person owning a right
¤ And transaction costs of moving rights to
the right person
¤ By starting with a bundle containing
rights that are likely to be most valuable to the owner of the other rights in
the bundle
¤ The right to farm is worthless without
the right to go on the land, of little value without the right to exclude
trespassers
¤ The right to control high overflights, on
the other hand
á
Is not of
especial value to the land owner
á
And if each
landowner has it for his land, rebundling to make airline flights possible is
very costly
á
Similarly
for the right to broadcast radio signals over the land
¯ Real world example: Pennsylvania
¤ A state made largely of coal
¤ Land consists of three estates
á
The surface
estate
á
The mineral
estate
á
The support
estate
¤ They can be separately owned
á
Under some
circumstances the support estate is most valuable to the surface owner
á
Under some
to the mineral owner
á
So they can
contract to move it to the one who values it more
v Torts I
¯ Optimal level of precautions
¤ We want legal rules that induce people to
take precautions against accidents
¤ Up to the point where an additional
precaution costs as much as it saves
¤ So there is an "optimal level of
auto accidents" above zero
¤ Given that preventing such accidents is
costly
á
Might mean
spending more on better cars
á
Driving
more slowly
á
Driving
less
á
É
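¯ A sketch of that tradeoff in Python; the damage figure and the precaution-to-probability relation are assumptions invented for illustration:
    DAMAGE = 100_000                     # harm to the pedestrian if an accident happens

    def accident_probability(spend):
        # more spending on care means fewer accidents, with diminishing returns
        return 0.05 / (1 + spend / 1_000)

    def total_cost(spend):
        return spend + accident_probability(spend) * DAMAGE

    best = min(range(0, 10_001, 100), key=total_cost)
    print(f"efficient precaution: about ${best:,} (expected total cost ${total_cost(best):,.0f})")
¯ Under strict liability (below) the driver bears the whole damage, so minimizing his own cost is exactly this calculation; that is the sense in which liability rules can decentralize the choice.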
¯ One sided injury
¤ A driver injures a pedestrian
á
Only the
pedestrian is injured
á
And
(assume) there is nothing he can do about it
á
So our
objective is to get the driver to take the optimal level of precautions
¤ Strict liability
á
The driver
must pay damages equal to the damage done
á
So he bears
the entire cost
á
So it is in
his interest to take all precautions that are worth taking
¤ Negligence (economist's version)
á
The driver
pays the damages if and only if he did not take at least the optimal level of
precaution
á
So either
he takes the optimal level of precaution and is not liable
á
Or he
doesn't, and is liable, and if he is liable, as we just saw
á
It is in
his interest to take the optimal level of precautions
á
So the
outcome is the same. Except É
¤ This assumes the court can judge what
precautions he took and should have taken
á
Divide
precautions into two categories
¬ Observable—how often does he have
his brakes checked
¬ Unobservable—how attentive was he.
Did he really need to take this trip?
á
In
practice, the negligence rule can only apply to observable precautions
á
So a
rational driver takes the optimal level of observable precautions, will not be
liable, and has no incentive to take the unobservable precautions
á
In the
literature, "unobservable precautions" are often called
¬ Activity level
¬ Since how much you drive might be
observable, but É
¬ The court has no way of knowing how
important trips are for you
¬ So in practice ignores the question of
whether you are driving more than you ought, so increasing the risk of
accidents more than you ought.
¯ Two sided injury
¤ Two sided in causation, not effect. Only
the pedestrian is injured, but the risk depends on his precautions as well as
the driver's
¤ Strict liability on the driver means that
á
The driver
has an incentive to take the efficient level of precautions, but É
á
The
pedestrian, who will be fully compensated, has no incentive to take precautions
at all
¤ No liability gives the opposite
result—incentive for the pedestrian, who bears the risk, none for the
driver
¤ Negligence on driver, if all precautions
are observable, however É
á
The driver
has an incentive to take the optimal precautions, so É
á
If there is
still an accident the driver is not liable, so É
á
The
pedestrian has an incentive to take optimal precautions too
á
In effect,
we are controlling the driver by a top down mechanism
¬ The court decides what he ought to do
¬ And punishes him by making him liable if
he doesn't do it
á
And
controlling the pedestrian by the standard decentralized mechanism
¬ He bears the cost, so
¬ It is in his interest to take it into
account in his precautions
á
Works
imperfectly if not all precautions are observable, courts make errors, etc.
¤ Another alternative—make each of
them bear the full cost
á
Fine the
driver the amount of the damage done—gives him the incentive
á
Don't give
the fine to the pedestrian—he still has the incentive
á
Of course,
in that system, neither has an incentive to report the accident
á
We have
converted from a tort to a criminal solution to the problem
¤ Yet another—divide the cost between
them
á
This is
like coinsurance.
á
Driver
bears (say) half the cost
¬ So he doesn't have an incentive to take
all precautions worth taking
¬ But he does have an incentive to take the
ones most worth taking
¬ I.e. the ones where the payoff is much
larger than the cost
á
And
similarly for the pedestrian
á
This may be
how tort works in practice
¬ Most pedestrians would prefer not to be
hit, even if they could sue
¬ Because the tort system, arguably,
doesn't fully compensate
¯ Strict liability with a defense of
contributory negligence
¤ Works just like negligence liability
¤ With the roles of driver and pedestrian
exchanged
v Torts II: What does causation mean
anyway?
¯ You see a friend walking along, stop him
to chat for a minute.
¯ He continues on. A barrel falls out of an
upstairs window on him and kills him
¯ Did you cause his death? If you hadn't
stopped him, he wouldn't have been under the barrel when it fell
¯ Should you be liable?
v Torts III: Caveat emptor vs caveat venditor
¯ A coke bottle explodes, back when they
were made of glass
¤ I was holding it and get injured
¤ Is coca cola liable? Should they be
¯ Under caveat emptor—"buyer beware"—I
take the bottle as I find it, risk included
¤ Is it in Coke's interest to take optimal
precautions to prevent defective bottles?
¤ Yes if the consumer is fully informed
about the risk, since I will pay less for a bottle the more likely I think it
is to explode on me
¤ No if the consumer doesn't have enough
data to estimate the risk
¤ So the argument against caveat emptor is that making Coke liable will give them
an incentive to take the right precautions
¯ Under caveat venditor ("seller beware") Coke is liable
¤ Why did the bottle explode?
á
The bottle
had been sitting on the table for three hours, and I was holding it when
shaking my hand to emphasize a point in my 4th of July speech.
á
And wasn't
wearing glasses
¤ More generally, the risk of accident
depends on how the product is used, what precautions the user takes.
¤ And caveat venditor eliminates the user's incentives to take precautions
¤ And it is far more difficult for Coke to know how careful each user is, and charge a higher price to more careless users, than for the user to know how risky Coke bottles, on average, are, and base the price he is willing to pay in part on that risk.
¯ A third alternative is freedom of contract
¤ If the default rule is caveat emptor the seller can include a guarantee. As
many sellers do.
¤ If the default is caveat venditor the buyer can sign a waiver of liability.
¤ And have it enforced by the courts.
¤ So now it is the opinion of buyer and
seller of the costs and benefits to them of the alternative rules that
determines what actually happens.
v Contracts
¯ Much of this we have already done
¤ Why you want to design a contract to
maximize the size of the pie
¤ How to do it
¤ By looking at the incentives that the
contracts give the two parties
á
Ideally, to
make decisions that maximize the combined benefit
á
For
instance, a fixed price contract when quality isn't an issue
á
Which gives
the builder an incentive to minimize cost--including his time and trouble as
well as price paid.
¯ One interesting issue is punishment for
breach of contract
¤ We want efficient breach--breach when
completion costs more than it is worth
¤ But not opportunistic breach--breach that
benefits the breaching party by less than it injures the victim
¤ And we can use damage rules to try to get
that result
á
Expectation
damages: The breaching party must make the other party whole
¬ Meaning as well off as if there had been
no breach
¬ Again the Pigouvian solution: Breaching
party bears all of the costs of breach, gets all the benefit
¬ So breaches only if net benefit is
greater than net cost.
á
Reliance
damages
¬ I must make you as well off as if the
contract had not been signed
¬ But do not have to compensate you for the
gain you would have made from the contract being carried through
¬ So may have too great an incentive to
breach if there was a profit expected.
á
No damages?
¬ Still gets the efficient result in a
fully Coasian world
¬ Because when you threaten a breach that
benefits you by $1000 and hurts me by $2000
¬ I propose that I instead raise the price
I am paying you by $1500
á
Specific
performance?
¬ Still works in a fully Coasian world,
because
¬ When I refuse to permit a breach that
costs me $1000 and saves you $2000
¬ You offer to readjust the price by $1500
in my favor
á
The
disadvantage of the latter rules is the potential for bargaining costs and
bargaining breakdown
á
The
disadvantage of expectation damages (and reliance damages) is that a court has
to measure the costs
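¤ A toy Python comparison of those rules; all the numbers are hypothetical:
    # Completing the job would cost the seller $1,000; performance is worth
    # $1,500 to the buyer, who has already spent $400 relying on the deal.
    # (Price terms are netted out to keep the sketch short.)
    completion_cost = 1_000
    buyer_value = 1_500          # buyer's loss from breach, counting his lost gain
    buyer_reliance = 400         # buyer's loss measured only by reliance spending

    rules = {
        "expectation damages": buyer_value,    # as well off as if performed
        "reliance damages": buyer_reliance,    # as well off as if never signed
        "no damages": 0,
    }
    breach_is_efficient = completion_cost > buyer_value      # False with these numbers
    for rule, damages in rules.items():
        breaches = completion_cost > damages                 # breach iff cheaper than paying
        verdict = "matches" if breaches == breach_is_efficient else "does not match"
        print(f"{rule}: seller breaches = {breaches} ({verdict} the efficient choice)")
¤ With these numbers only expectation damages line up with efficient breach; in a fully Coasian world the other rules could still get there by renegotiation, at some bargaining cost.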
¤ A further issue is inefficient reliance
á
You agree
to buy a million customized widgets from me in six months
á
I can
produce them in my current factory at a cost of $3 million, or
á
Retool at a
cost of $1 million, then produce for an additional $1 million
á
My
retooling is a reliance expenditure
¬ I am relying on your buying the widgets
¬ And the million dollars is down the drain
if you don't
á
When should
I customize? When shouldn't I?
á
When will I
under expectation damages?
¤ A final alternative is liquidated damages
á
We agree in
advance on what damages for breach will be
á
If we know
enough when the contract is signed, we set them at expectation damages
á
What is the
effect on the incentive for inefficient reliance?
á
Courts may
refuse to enforce a liquidated damages agreement
¬ If they think it is a penalty clause
¬ I.e. that the amount is substantially
more than damage done
¬ Are there good reasons parties might want
a penalty clause?
¤ Consider (digression) the general issue
of property rights vs liability rights
á
We allocate
cars via property rights
¬ If you steal my car, we don't just bill you for two days' use
¬ We impose a punishment intended to stop
you from doing it (but see more below)
¬ Perhaps on the theory that cars will then
be allocated via the market
¬ If it was worth that much to you, you
should have bought it from me.
á
We allocate
accident risk via liability rights
¬ If I dent your car, I don't get hanged, I
get sued
¬ And am supposed to pay enough to make up
for the damage
á
Could
imagine doing it the other way
¬ Thief must reimburse victim for actual
cost imposed
¬ Driver must get permission from everyone
he might dent before he pulls out of the driveway
á
Pretty
clearly, that wouldn't work as well, because É
¬ Markets allocate better than courts for
ordinary property, but É
¬ For imposing a low risk on many people,
transaction costs of using the market are prohibitive, so use the court instead
á
Note that
one version of property vs liability is the question of whether a tort verdict
should result in an injunction or damages.
¤ Back to liquidated damages
á
A penalty
clause is a privately agreed on property rule
á
"if
you want to breach the contract, you have to buy the right from the other
party."
á
The legal
system sometimes uses property rules instead of liability rules
á
Why
shouldn't parties sometimes find it in their interest to?
v Crime
¯ What is wrong with crime anyway?
¤ I steal $100 from you, I am $100 better
off, you are $100 worse off
¤ In terms of economic efficiency, why
should the rest of us care?
¯ If I, or anyone, can steal $100 from you
¤ We compete to be the one who does it--you
only have one wallet
¤ As long as the most energetic thief is
spending less than $100 on the job
¤ It pays someone else to spend more
¤ The opportunity to steal is an incentive
to rent seeking
¤ Which can dissipate the full amount
stolen
¤ So the marginal thief abandons a $10/hour
job to make $10.01/hour stealing, after allowing for all special costs and
benefits of the job
¤ Some thieves are better than average, so
benefit
¤ But there is an additional cost to
precautions taken by potential victims
á
Also rent
seeking, even though we approve of them.
á
Since the
lock on my door is an expenditure made to make sure I, not you, end up with the
$100
¯ Efficient crimes
¤ A hunter, lost in the woods and starving,
finds a locked cabin with a telephone and food
¤ Breaks in, feeds himself, calls the cops
¤ His gain is larger than the owner's
loss--an efficient crime
¤ Not true of most crime, because usually,
if I value your property more than you do I don't have to steal it, I can buy
it
¯ How might we arrange to have efficient and only efficient crimes?
¤ One solution is to special case
them--make them legal
á
The
starving hunter gets off under the doctrine of necessity
á
The traffic
cop notices your wife about to give birth in the back seat and doesn't give you
a ticket
á
But this
only works if the fact that it is efficient is observable from the outside
¤ The other solution is to punish them--but
not too much
á
Set the
punishment equal to the damage done to the crime's victim
á
So if my
benefit is greater, I will still commit the crime
á
Pigou again
¤ Criminal punishment is usually probabilistic
á
So instead
of setting punishment if convicted equal to damage
á
We set
expected punishment--roughly, probability of conviction times punishment if
convicted--equal to damage
¯ What is wrong with this (common in the
L&E literature) story?
¤ Do we believe that the reason we don't
raise the punishment for murder, or try harder to catch murderers, is that we
are afraid there will not be enough murders?
¤ We have counted costs and benefit to
criminal and victim, but ignored
¤ Costs and benefits of catching and punishing
criminals to the rest of us
¯ The cost of deterring crime is
¤ The cost of catching criminals, plus
¤ The cost of punishing them
á
Economic
efficiency counts everyone, so
á
A fine is
costless--what the criminal loses the state gets
á
Execution
is expensive--it costs one life (to the criminal) and we don't get a life
á
Imprisonment
is even more expensive, relative to the amount of deterrence we get
¬ The criminal loses his freedom and
¬ We have to pay for the jail
¤ Costs of trials, administration,
etc.--the hangman's wages.
¯ Suppose expected punishment is less than
damage done
¤ The marginal crime--the one that is not
quite deterred
¤ Does $1000 worth of damage, benefits the
criminal by $900
¤ And happens--because the expected
punishment is $899.
¯ We could raise expected punishment a
little and deter that crime
¤ Thus saving $100 net cost, but É
¤ Increasing expected punishment might increase the cost of deterrence by $200
¤ In which case we are better off not doing
it.
¤ On the other hand
¯ The cost of deterring one more crime
might be negative!
¤ Because if we deter a crime we don't have
to punish it
¤ And punishment (and apprehension and É)
is expensive
¯ So if the crime rate is very elastic
¤ A little extra punishment gets a big
reduction in crime rate
¤ We want expected punishment more than
damage done
¤ Because deterring crimes that are
(slightly) efficient saves us more in enforcement costs than it loses us in
gains from efficient crimes
¯ If the crime rate is very inelastic, on
the other hand
¤ We want expected punishment less than
damage done, because
¤ Crimes that are only slightly inefficient
¤ Cost more to deter than deterring them is
worth
¯ Which might explain why we don't deter
all murders, even if
¤ We think they are all inefficient crimes
¤ Or at least, we can't all agree on which
people the world is better off without.
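¯ A back-of-the-envelope Python version of the marginal-crime numbers above (the enforcement-cost figures are hypothetical):
    damage = 1_000                       # harm the marginal crime does
    gain = 900                           # benefit to the criminal
    expected_punishment = 899            # probability of conviction x punishment

    happens = gain > expected_punishment
    print(f"at expected punishment ${expected_punishment}, the crime {'happens' if happens else 'is deterred'}")

    net_loss_from_crime = damage - gain  # $100 saved if we deter it
    for extra_enforcement_cost in (50, 200, -30):
        worth_it = net_loss_from_crime > extra_enforcement_cost
        print(f"deterring it at an extra cost of ${extra_enforcement_cost}: "
              f"{'worth it' if worth_it else 'cheaper to let it happen'}")
¯ The negative entry is the case where deterring the crime saves apprehension and punishment costs, which is when we want expected punishment above damage done.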
v Law enforcement
¯ Police are rational too, and respond to
incentives
¯ Consider civil forfeiture
¤ Property used in the commission of a
crime--say a house where marijuana was sold--forfeits to the state
á
And since
it is civil, the standard of proof is only "preponderance of the
evidence."
á
No need to
prove the owner of the house guilty of anything.
¤ If the property goes to the police
department that seizes it, how are its incentives affected?
¤ Historical evidence (Bruce Benson
article)
á
Many states
had forfeiture statutes that turned over the property to some agency other than
law enforcement, such as education funds
á
Federal
forfeiture statute passed. Not much effect—most law enforcement is state
and local
á
Federal
statute amended
¬ Federal law enforcement could
"adopt" a state or local seizure
¬ Share the proceeds with the police
department responsible
¬ Thus evading the state rules on where the
money went
á
Law
enforcement resources shifted towards the War on Drugs
¤ My experience: Conference on money
laundering in the Pacific Rim
á
One series
of talks described a particular operation
á
Which
brought in, if my memory is correct, more than a hundred million dollars
¯ This suggests the answer to an
interesting puzzle:
¤ Why we use inefficient punishments
á
Fines are
more efficient than imprisonment or execution
á
Execution
with organs forfeiting for transplant is too
á
Why not a
system designed to squeeze all it can out of convicted criminals?
¤ Perhaps because such a system creates an
incentive to convict people
á
Larry Niven
story about organ forfeiture
á
African
colonialist version
á
Mencken's
American version
á
Modern
concern with punitive damages, class actions, É
v Another puzzle for you to think about
¯ Why do we have both criminal law and tort
law to do the same thing?
¤ Impose costs on people who do things that
hurt others
¤ One system with private prosecution, one
public
¯ Would a pure criminal or pure tort system
do the job?
¤ Iceland managed for over 300 years with
pure tort
¤ China for much longer with pure criminal
¯ Why do we treat certain things as torts,
certain things as crimes?
¯ Why do we have one set of legal rules for
tort, another for crimes
¤ Preponderance of the evidence vs beyond a
reasonable doubt
¤ Intent not required for torts, is
required for crimes
¤ É
v In choosing how to present and look at
data, there are two related issues
¯ How to actually learn things about the
data
¯ How to convince other people of things
you want them to believe
¤ Both how to do it, and É
¤ How not to be a victim of people doing it
v Descriptive Statistics I: Mean vs median
¯ Summing up data
¤ Mean aka average
¤ Median
¯ Suppose you want to know about the
average, cubical, cardboard box
¤ We have five boxes
¤ 1'x1'x1', 2'x2'x2', … 5'x5'x5'
¤ What are the average height, width, and
depth of a box?
¤ What is the average volume of a box?
¤ Do they correspond?
¤ What if we use median?
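¤ A quick check in Python, using the five boxes above:
    from statistics import mean, median

    edges = [1, 2, 3, 4, 5]                 # feet
    volumes = [e ** 3 for e in edges]       # 1, 8, 27, 64, 125 cubic feet

    print("mean edge:", mean(edges), "cubed:", mean(edges) ** 3)      # 3 -> 27
    print("mean volume:", mean(volumes))                              # 45, not 27
    print("median edge:", median(edges), "cubed:", median(edges) ** 3)
    print("median volume:", median(volumes))                          # 27 = 27
¤ The mean "average box" depends on whether you average edges or volumes; the median box is the same box either way.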
¯ The mean
¤ depends on how we measure the variable,
the median does not
¤ is sensitive to large outliers, the median is not
¤ The median ignores how far anything is
above or below the median
¯ Consider two income distributions
¤ A: $5000, $5001, $5002
¤ B: $4999, $5000, $10,000
¤ Which has the higher median? Mean? Which
measure is more interesting?
¯ On the other hand—suppose you
believe some of your numbers are wrong
¤ In case B, a typo converts $10,000 to $100,000
¤ Messes up the mean, doesn't affect the median
¯ Or suppose you have ordering, but no
natural quantitative measure
¤ Comparing chess players, say
¤ You could use their rating, average it,
but É
¤ Which player is average then depends on
just how ratings are calculated
¤ Which player is median only depends on
the rating getting the order right
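A minimal Python sketch of the box example above (assuming five cubical boxes with 1' through 5' edges); the numbers are there only to make the mean/median contrast concrete:

    from statistics import mean, median

    edges = [1, 2, 3, 4, 5]              # edge lengths in feet: 1'x1'x1' ... 5'x5'x5'
    volumes = [e ** 3 for e in edges]    # 1, 8, 27, 64, 125 cubic feet

    print(mean(edges), median(edges))        # 3 and 3
    print(mean(volumes), median(volumes))    # 45 and 27
    print(mean(edges) ** 3)                  # 27: the mean edge cubed is not the mean volume,
                                             # but the median edge cubed is the median volume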
v Histogram
¯ Visual portrayal of frequency: the idea
¯ Again consider my cubic boxes
¤ This time lots of them—1', 1.1', 1.2', …
¤ With different numbers of different
sizes, so a histogram might be interesting. But É
¤ If we do it by edge size, we ask how many boxes have sizes 1'-1.5', 1.5'-2', etc. Get a histogram
¤ If we do it by volume, we ask how many are from 1 cubic foot to 2 cubic feet, 2 to 3, …
¤ How do the results differ?
á
Increasing from 1' to 1.5' increases volume from 1 cubic foot to 27/8 = 3.375
á
From 1.5 to
2 increases volume from 3.375 to 8.
á
So the relative sizes of the intervals are different for the different methods
á
Making the
pattern of the frequency distribution different.
á
If it were uniform the first way (by edge), it would look like a decreasing frequency the second way (by volume); a distribution uniform by volume would look like an increasing frequency by edge
¯ On the other hand …
¤ If there are lots and lots of 3Õ boxes
relative to everything else
¤ That will show up as a spike either way
¤ So histogram is useful for spotting that
kind of pattern
á
Is it
double peaked
á
My usual
student evaluations—tells me something, not clear what
á
Is it asymmetrical? Depends in part on how you define your variable. Boxes.
á
What would
you expect a U.S. income distribution to look like? Why?
¤ But somewhat ambiguous for a "frequency is increasing" or decreasing pattern
¤ And a clever lawyer could take advantage
of that.
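A sketch of the edge-versus-volume binning point discussed above, using made-up boxes whose edges are drawn uniformly between 1' and 2'; the exact counts are illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)
    edges = rng.uniform(1.0, 2.0, 10_000)        # edge lengths uniform between 1' and 2'
    volumes = edges ** 3                          # volumes run from 1 to 8 cubic feet

    by_edge, _ = np.histogram(edges, bins=np.linspace(1.0, 2.0, 6))      # five equal edge bins
    by_volume, _ = np.histogram(volumes, bins=np.linspace(1.0, 8.0, 6))  # five equal volume bins

    print(by_edge)     # roughly flat counts
    print(by_volume)   # counts fall as volume rises: same boxes, different-looking histogram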
v Other ways of fooling or being fooled
¯ Have the vertical axis start well above
zero, to magnify changes
¯ U.S. vs Japanese CO2 growth—real
example
v Dispersion
¯ The mean or median does not tell us how
wide the spread is—which might matter
¯ The usual definition is the standard
deviation
¤ Defined as the square root of the average
squared deviation!
¤ One reason to do it that way is that the average of the deviations is …?
¤ Squaring means that both negative and
positive deviations increase the average
¤ Other reasons we won't go into
¯ Chebyshev's rule—for any distribution, normal or not
¤ At least 75% of the points are within 2
standard deviations of the mean
¤ At least 89% within three standard
deviations
¤ Can you see why this has to be true?
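One way to see it: if more than a quarter of the distribution lay more than two standard deviations from the mean, those points alone would contribute more than (1/4)(2 stdev)^2 = the variance to the average squared deviation, which is impossible. A quick check in Python on a deliberately skewed, invented sample:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.exponential(scale=1.0, size=100_000)   # very skewed, nothing like a bell curve

    m, s = x.mean(), x.std()
    print(np.mean(np.abs(x - m) <= 2 * s))   # well above the guaranteed 75%
    print(np.mean(np.abs(x - m) <= 3 * s))   # well above the guaranteed 89%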
v Normal distribution
¯ A particular family of distributions
("bell curve")
¯ Where once you know the mean and the
standard deviation you know the distribution
¯ Which many real world distributions approximate
¯ And which has characteristics that are
known and useful
¯ About 68% within one stdev, 95% within
two, 99.7% within three
¯ If you know the mean IQ is 100 and the
stdev is 15, just how special is your IQ 150 kid?
¯ Z score table is the continuous version
of that rule
¤ Z score is the number of standard
deviations from the mean.
¤ Table tells you how likely it is that the
Z score is that high or higher
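A short worked computation for the IQ example (scipy is assumed to be available):

    from scipy.stats import norm

    z = (150 - 100) / 15        # about 3.33 standard deviations above the mean
    print(z)
    print(norm.sf(z))           # upper-tail probability, roughly 0.0004: about 1 kid in 2,300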
v Digression:
¯ How likely is it that a bridge deal will
be 13 spades to one player, 13 hearts to another, É?
¤ How about any other deal?
¤ So why, in the first case, do we conclude
that someone stacked the deck?
¯ Suppose you think a coin is fair
¯ Flip it 100 times. Heads 53 times. What
question do you ask?
¤ How likely is it that a fair coin will
come up 53/47? Not very. Coin must not be fair?
¤ But you get the same answer for 52/48, … indeed any single outcome is rather unlikely
¤ Ask instead, how likely is it that the
evidence against a fair coin is at least this strong, i.e.
á
At least 53
heads or at least 53 tails
á
Pretty
likely
¤ That is a two tailed test. The null hypothesis is a fair coin, and if unfair you don't know which way
¤ If you somehow knew the coin was either
fair or weighted towards heads, use a one tailed test—how likely is it
that I will get at least 53 heads out of 100.
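The 53-heads example, done exactly with the binomial distribution (scipy assumed); since you cannot get 53 of each in 100 flips, the two tails do not overlap and doubling one tail is exact:

    from scipy.stats import binom

    print(binom.pmf(53, 100, 0.5))        # about 0.07: any single count is fairly unlikely
    one_tail = binom.sf(52, 100, 0.5)     # P(at least 53 heads) for a fair coin, about 0.31
    print(2 * one_tail)                   # two-tailed: at least 53 heads or at least 53 tails,
                                          # about 0.62, so no real evidence against fairness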
v Sample statistics vs population
statistics
¯ You have statistics, such as mean or
standard deviation, for your sample
¯ You want to estimate the statistics for
the larger group the sample is drawn from
¯ Consider standard deviation—which I
think the book gets wrong
¤ Suppose I have a sample of one
á
Population
is the classroom
á
We want to
know the distribution of heights
á
I happen to
know my height: 5' 3.5"
¤ What is the mean of that sample?
¤ What is the standard deviation?
¤ Do we conclude that the population has
that mean with that standard deviation?
¯ The rule turns out to be that you estimate the standard deviation of the population by dividing by N-1, not N (see the sketch below)
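A sketch of the N-1 point with a made-up four-person sample; numpy's ddof argument switches between the two divisors:

    import numpy as np

    heights = np.array([63.0, 66.0, 70.0, 71.0])   # invented heights, in inches
    print(heights.std(ddof=0))    # divide by N: the spread of this sample, about 3.2
    print(heights.std(ddof=1))    # divide by N-1: the estimate for the population, about 3.7
    # With a sample of one there is no ddof=1 answer at all (0/0), which matches the
    # classroom example: one observation tells you nothing about the spread.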
4/6/06
v Review
¯ Different ways of summarizing a bunch of
data
¤ Mean vs median
¤ Histogram
¤ Standard deviation: Chebyshev's rule
¤ Some ways may be deceptive, deliberately
or not
¯ Normal Distributions
¤ Bell curve shape
¤ All normal distributions are the same
except for two parameters
á
Mean—where
the center is
á
Standard
Deviation—how much it is stretched out
¤ So if you know how many standard deviations
from the mean an observation is
á
You can
look up on a table how likely it is to be at least that far from the mean
á
Which
information might be used in two ways
¬ If you are confident about your mean and
standard deviation, tells how atypical this sample is
¬ Alternatively, if your observation is
very unlikely, perhaps you are wrong about the mean and/or standard deviation
¬ Warren Buffett as a five sigma event
¬ Which will get us into hypothesis testing
¤ Lots of distributions aren't normal, but
…
á
If we are
looking at the average of a sample from a distribution
á
The Central
Limit Theorem tells us that the distribution of averages
á
Approaches
a normal distribution as the size of the sample increases
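A small simulation of the Central Limit Theorem claim, using an invented and strongly skewed population:

    import numpy as np

    rng = np.random.default_rng(2)
    population = rng.exponential(1.0, 1_000_000)                         # heavily skewed "population"
    sample_means = rng.choice(population, (10_000, 100)).mean(axis=1)    # 10,000 averages of 100 draws

    m, s = sample_means.mean(), sample_means.std()
    print(np.mean(np.abs(sample_means - m) <= s))       # close to 0.68, as for a normal curve
    print(np.mean(np.abs(sample_means - m) <= 2 * s))   # close to 0.95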
¯ Practically anything is wildly unlikely
¤ Any particular series of heads and tails
with coins, any particular bridge deal
¤ But "some sequence that ends up
50/50" is more likely than a particular sequence
¤ And if your suspicion is a weighted coin,
the question isn't
á
How likely
is this result with a fair coin (very unlikely, whatever the result) but
á
How likely
is a result at least this far from the mean with a fair coin
á
Since any
result far from the mean inclines you to reject the "fair coin"
hypothesis
á
And your
real question is "how likely am I to reject that hypothesis if it is
true?"
¯ Sample statistics vs population
parameters.
¤ You want to know average height and
standard deviation for law school students
¤ You measure the students in one class.
¤ What does that tell you about all the
students at SCU? In the Country? The World?
v Hypothesis testing
¯ The basic logic of confidence results
¯ You have a null hypothesis—this
coin is fair
¯ You have a sample—say the result of
flipping the coin ten times
¯ You want to decide whether the null
hypothesis is true
¤ In the background there is an alternative
hypothesis
¤ Which is relevant to how you test the
null hypothesis
¤ For instance—this coin is not fair,
but I don't know in which direction
¯ You ask: If the null hypothesis is true,
how likely is a result at least this far from what it predicts in the direction
the alternative predicts
¤ For example, if the coin is fair
¤ How likely is it that the result of my
experiment would be this far from 50/50?
¯ Suppose the answer is that if the coin is
fair, the chance of being this far off 50/50 is less than .05 (i.e. 5%)
¯ You can then say that the null hypothesis
is rejected at the .05 level
¯ Does this mean that
¤ the null hypothesis has less than .05
chance of being true?
¤ The alternative hypothesis has at least
95% chance of being true?
¤ To see why neither is correct, consider a
simple experiment:
¯ Experiment
¤ Null hypothesis: Coin in my pocket is an
ordinary fair coin
¤ Alternative—coin is double headed
¤ Flip the coin once—comes up heads
¤ Probability of a result that far in that
direction is .5
¤ Do we conclude that the probability that
the coin is double headed is .5?
¯ What's wrong with the (common)
misunderstanding
¤ .05 is the probability of our result if
the null hypothesis is true
¤ not the probability it is true if we get
that result
¤ If, as in my example, the null hypothesis is initially much more likely than the alternative—very few random coins are double headed
¤ then the combined chance that the coin is
fair and it came up heads (about .5)
¤ is much higher than the combined chance
that it is double headed (say one in a million) and came up heads (one in
one—if it's double headed)
¤ so after one flip—even after three
or four all heads—we still think the odds are it is a fair coin
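A sketch of that arithmetic via Bayes' rule, with an assumed prior that one coin in a million is double headed (the one-in-a-million figure is just the illustration used above):

    prior_two_headed = 1e-6
    prior_fair = 1 - prior_two_headed

    for heads_in_a_row in range(1, 6):
        p_data_if_fair = 0.5 ** heads_in_a_row
        p_data_if_two_headed = 1.0
        posterior = (prior_two_headed * p_data_if_two_headed) / (
            prior_two_headed * p_data_if_two_headed + prior_fair * p_data_if_fair)
        print(heads_in_a_row, posterior)   # still about 3 in 100,000 even after 5 straight heads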
¯ so a confidence interval is a simple way
of reporting how strong this piece of evidence against the null hypothesis is,
but not how likely the null hypothesis is to be true
¯ analogously, it might be that a witness
identification has only one chance in four of being wrong by chance
¯ but if you also have an absolutely solid
alibi, you still get acquitted
¯ "Statistically significant"
doesn't mean "important"; it means "unlikely to occur by chance"
¤ suppose I take a random coin and flip it a great many times (millions, for a bias this small)
¤ the result will prove it isn't a fair
coin to a very high level of significance
¤ even if it is "unfair" only by
.501 vs .499 probability
v The validity problem vs sampling error vs
bias
¯ One problem with samples, which we have
been discussing, is sampling error
¤ When you select ten students,
¤ by chance they might be taller or shorter
than average
¯ Another is bias.
¤ If you are measuring age, not height, and
select students in this class
¤ Since it isn't taken by first years
¤ Your sample is biased towards older
students… although
¤ There may be a bias the other way because
it isn't an evening class.
¤ Famous example—telephone poll that showed Dewey would win
¯ A third is validity
¤ If you test age by asking people their
age when their friends are around
¤ In some populations people prefer to exaggerate their age
¤ In others to make it look smaller
¤ Similarly for asking about adultery in
the presence of a spouse
¤ Or drug use in any context where the
questioner knows the name of the respondent
¯ Note that bias and invalidity may be
either accidental or deliberate
v Specification Search problem
¯ How to make a fortune giving investment
advice
¯ How to prove that Diet Coke causes cancer
¯ How you get the problem without even
trying
¯ But nowadays, there are programs designed
to try
¯ Which is one reason why you should put
your data on the web and let other people play with it
4/11/06
v Using statistics
¯ Theory: The average height of LS students
is 5'10"
¤ You measure one person's height. It is
5'7". How good evidence is this for or against?
á
With no
additional information, you can't say
á
Because you
have no idea what the standard deviation is
á
But if the theory includes a standard deviation of 2" … how would you analyze the data?
¤ Suppose the theory is only of the mean
á
How do you
go about estimating the standard deviation?
á
By
measuring several students, calculating the standard deviation of the sample
á
It will be,
on average, a little lower than the standard deviation of the population. Why?
á
You adjust
for that by dividing by N-1 instead of N, get an estimate for the population
á
Suppose the
heights were 6' 2", 6', 5'7", 5'10"
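Working through those four heights in Python (74, 72, 67 and 70 inches); statistics.stdev already divides by N-1:

    from statistics import mean, stdev

    heights = [74, 72, 67, 70]            # 6'2", 6', 5'7", 5'10" in inches
    m = mean(heights)                      # 70.75
    s = stdev(heights)                     # about 2.99, the N-1 estimate for the population
    se = s / len(heights) ** 0.5           # standard error of the sample mean, about 1.5
    print(m, s, se)
    print((m - 70) / se)                   # about 0.5: the sample mean sits only half a
                                           # standard error above the theory's 5'10"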
v Correlation is not causation
¯ More generally, facts don't speak for
themselves.
¯ Consider Peltzman's analysis of the
effect of requiring seatbelts (and some other things)
¤ Before the requirement, say, 40% of
crashes were fatal, after 30%
¤ After the requirement, 10,000
crashes/year.
¤ So the requirement saved 1000 lives/year
¤ What is the hidden assumption in this
argument?
¤ Why might it be wrong?
¯ Suppose you want to know whether the
death penalty deters murder
¤ How might you find out?
¤ Compare murder rates in states with and
without? What is wrong with that approach?
¤ Is there a better way?
v Statistics in the Law
¯ Are there enough blacks on the jury that
convicted your black client?
¤ What is the probability, on a random draw
from the population, of this many blacks or fewer?
¤ If you are the prosecutor, how do you
respond?
¯ Does a firm discriminate against women in
wages? Promotion?
¤ Look at average wage—lower.
¤ Look at average position—lower
¤ As the defense attorney, what issues
might you raise?
¤ As the plaintiff's attorney, how might
you answer them?
¯ Does this drug have dangerous side
effects?
¤ Of people taking this heart medication
who had heart attacks, what % died from them? 45%
¤ Of people not taking it, only 30%
¤ Sample size is 1000 people taking the
drug who had heart attacks, 1000 not taking it who had heart attacks
¤ You are the defendant's attorney. What
questions might you ask?
¯ Others?
v Exploratory Statistics: Looking for
patterns
¯ You are an enterprising torts attorney,
wondering if electric transmission lines do anything actionable
¯ How might you look for effects?
¤ Start with a database showing rates for
cancer and other things by county
¤ And a map showing transmission lines
¤ And access to the U.S. census data
á
Which give
you data at the individual level
á
Including
place of residence
á
Age, race,
gender
á
Possibly
cause of death for those that recently died?
v Summary: More conceptual than mechanical
¯ Descriptive statistics
¤ You have a bunch of data and want to
describe them
¤ Mean or median—advantages of each
¤ Some measure of spread—typically
standard deviation
¤ Plot of the distribution—a
histogram
¤ How you present the data can make a
difference
á
Which is
important both in fooling people and
á
Not getting
fooled
¤ Note that not all data are quantitative
á
You might
have categories: race, gender, nationality
¬ Note that race could be
quantitative—percent of ancestry
¬ But in the available figures usually
isn't
á
You might
have an ordinal rather than cardinal ranking
¬ Which chess player is better than which,
not by how much
¬ Individual preferences
á
As long as
there is a ranking you have a median, but not a mean
á
And for categories
you don't even have that.
¯ Hypothesis testing
¤ We conjecture that something is true:
This coin is fair
¤ We do an experiment
¤ What do the results tell us about whether
our conjecture is true?
¤ We can calculate how likely the result
(HHTHTTHTTH) is if the coin is fair
á
The answer is (1/2)^10 = 1/1024, approximately .001!
á
Same figure
for any other series of ten results, however
á
But add
them all together and we get a probability of 1
¤ We need some alternative hypothesis to
tell us which results count as more or less evidence against
¤ We can then ask "how likely is it
that the evidence will be at least this strongly against?"
¤ Alternative hypothesis: coin is weighted
towards heads.
á
So the more
heads, the stronger the evidence against the fair coin hypothesis
á
Our
experiment gave five out of ten.
¬ Probability of at least that many heads
if the coin is fair is …?
¬ Very very weak evidence against the
conjecture, but É
¬ Better than if it came out 7 tails and 3
heads.
¤ Alternative hypothesis: unfair coin,
direction unknown
á
Suppose it
comes up 7 heads, 3 tails
á
Ask the
combined likelihood of 7/3, 8/2, 9/1, 10/0, if the coin is fair
á
Double that
to allow for 7 tails, 3 heads etc.
á
Since they
are just as good evidence against
á
Suppose the
total is .2
á
We can then
say that there is a .2 probability that a fair coin would produce evidence this
good that it isn't fair.
¤ Errors of the first and second sort
á
If we take
.2 as adequate evidence (unlikely—but perhaps for a tort suit?)
á
We will
judge 1/5 of all fair coins to be unfair. Type one error
á
What fraction
of unfair coins will we judge to be fair? Type two error
¬ That depends how unfair the coins are
¬ If they are double headed, none of them.
¬ If they are .501/.499, about 4/5 of them
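The .2 and 4/5 above are round illustrative numbers; with exactly ten flips the achievable cutoffs are coarser. A sketch of the exact binomial arithmetic (scipy assumed):

    from scipy.stats import binom

    # Two-tailed evidence from 7 heads in 10 flips, if the coin is in fact fair:
    one_tail = binom.sf(6, 10, 0.5)        # P(at least 7 heads) = 176/1024, about 0.17
    print(2 * one_tail)                    # about 0.34 once "at least 7 tails" is included

    # Type II error for a barely unfair coin (heads probability 0.501), if we reject
    # fairness only when the two-tailed probability is 0.2 or less, i.e. on 8 or more
    # heads or 8 or more tails:
    p_reject = binom.sf(7, 10, 0.501) + binom.cdf(2, 10, 0.501)
    print(1 - p_reject)                    # about 0.89: almost all such coins look fair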
¤ My example was with coin tossing
á
Probabilities
are easy to calculate, if you know probability theory
á
A more
common example involves taking a sample, figuring out whether it is strong
evidence about some assertion about the distribution it is from
á
Which
usually involves setting it up so the distribution is approximately normal
á
Which we do
by using the central limit theorem
á
Heights of
students are nowhere close to a normal distribution
á
But the
average of the heights of 100 students comes quite close to one
á
Brief
explanation of what a probability distribution is.
¯ Inferential statistics
¤ We look at a sample, try to estimate the
characteristics of the population it is drawn from
¤ First problem is getting a fair sample:
Critique each
á
Gun
control, letters from constituents
á
For
percentage of motorists who drive drunk, breathalyzer test to the driver of
every twentieth car passing a checkpoint
á
For value
of environment, survey shoppers at a Wal-mart
á
Jury pool,
home phone numbers between noon and 5 on weekdays
á
To
determine % of rotten apples in a crate, check the ones at the top.
¤ If you know you can't get a fair sample
á
how might
you adjust?
á
What is the
risk in doing so?
¤ Second problem is measuring what you want
á
Law school
questionnaire problem
á
How you
frame a question matters to perceptions
á
And the
self-interest of the person answering matters to incentives
¤ The usual report of error margins ignores
both of these problems.
¤ It asks
á
If our
estimate of the standard deviation is correct
á
How likely is it that the mean of our sample differs from the mean of the population by at least a given amount?
á
If you are
using .05, you are asking
á
What is the
deviation such that the probability our mean is farther from the population
mean than that is .05?
á
Remembering
that the distribution of means is close to normal.
v Multivariate Statistics 1: Bivariate
¯ Each item (person, country, state, year)
has two characteristics
¤ How are they related to each other?
¤ Why?
¯ Descriptive approach: Scatterplot
¤ Approximate linear relationship. But note
¤ The plot might show you more complicated
things, that calculating the correlation coefficient would miss.
¤ The first one you would get a positive
correlation coefficient—what would you miss?
¤ The second one, near zero correlation.
But É
¤ Very useful if you are looking for
patterns
¯ Correlation coefficient: Numerical
description of the relationship
¤ Summary
á
Value from
–1 to 1
á
Sign tells
you whether larger than average values of one variable imply larger than
average values of the other (+) or smaller (-)
á
The
magnitude tells you how perfect the relation is, not the slope.
á
Which of these has the higher correlation coefficient?
á
This is the
same point I made earlier about significance
¬ Statistically significant means we are
sure the effect is there
¯ It says nothing about how large it is
¯ 550 heads/450 tails is much more
significant evidence of unfairness than
¯ 3 heads/1 tail
¤ Mathematical definition
á
For each
value of the first variable, calculate how many standard deviations it is from
the mean--+ if greater than mean, - if less
á
For each
observation (person, state, É) multiply that figure for the first variable
times that figure for the second
á
Average
over all observations
¬ (except you divide by n-1 instead of by n
in averaging)
¬ for the same reason we did it
earlier—sample slightly exaggerates the correlation for the population.
¬ I think
¤ Why this makes (some) sense
á
If above
average values of X occur for the same observation as above average values of
Y, the product is positive
á
If below go
with below, the product is still positive—negative times negative is
positive
á
So if the
two variables move together, get a positive correlation coefficient
á
If they
move in opposite directions, above average of one go with below average of the
other, so + times – or – times +, which gives negative
á
Average
lots of negative numbers, get a negative correlation coefficient
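A minimal sketch of that definition on invented height/weight data, checked against numpy's built-in:

    import numpy as np

    height = np.array([61, 64, 67, 70, 73])        # inches (invented)
    weight = np.array([120, 135, 150, 170, 186])   # pounds (invented)

    zx = (height - height.mean()) / height.std(ddof=1)   # deviations in standard-deviation units
    zy = (weight - weight.mean()) / weight.std(ddof=1)
    r = (zx * zy).sum() / (len(height) - 1)               # average the products, dividing by n-1
    print(r)
    print(np.corrcoef(height, weight)[0, 1])              # numpy agrees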
¯ Correlation is not (necessarily)
causation
¤ The result might be entirely due to some
third variable that causes both
á
Driving an
expensive car has a negligible effect on life expectancy—probably
negative if it's a sports car
á
But
probably correlates with life expectancy. Why?
á
Height has little effect on having children, but …
á
Number of children one has borne is probably negatively correlated with the height of adults
á
Because?
¤ Or it might be partly due to such third
factors, so you don't know how strong the causal effect is
¤ And third factors might push the other
way, reducing, eliminating, or reversing the causation
á
Death
penalty and murder rates
á
If factors
that make murder rates high make death penalty more likely
á
Either
because high murder rates create pressure for death penalty
á
Or because
the social factors that make people more willing to kill illegally also make
them more willing to kill legally.
á
You might
have a positive correlation masking a negative causation
¯ And causation is not necessarily
correlation either
v Causation, correlation, and prediction
¯ Correlation can be used to predict
¯ "if the state has a death penalty,
it probably has a high murder rate"
¤ doesn't depend on which causes which
¤ or whether there is a third factor
causing both
¯ but if you have the causality wrong, you might
get the prediction wrong
¤ because you are missing other relevant
evidence
¤ taller adults are less likely to have borne children than shorter ones
¤ but taller women aren't.
v Review: Bivariate statistics
¯ We have two characters, each associated
with individuals in a population
¤ Height and weight of people
¤ Rainfall and average temperature of years
¤ Income and LSAT score
á Which could be parental income and
student LSAT score or
á Entering LSAT and later income as a
lawyer
¯ We want to know how the two are related
¤ When height is above average, is weight
above average? (Probably)
¤ Do cool years have more rainfall?
¯ Correlation coefficient is a measure of
how consistently
¤ When one variable is above its average,
the other is above its (positive correlation)
¤ Or when one is above, the other is below
(negative)
¤ 1 is perfect correlation--if you plot
them they are on a straight line, slopes up
¤ -1 is perfect negative
correlation--straight line, slopes down
¤ 0 is no correlation
¯ It measures the consistency of the
effect, not the strength
¤ If every inch above average height
results in .001 lbs above average weight, the effect of height on weight is
tiny
¤ But very consistent
¤ So the correlation coefficient is 1
¤ An example of the distinction between the
statistical significance of a relation and its size.
¯ Scatter diagram might show patterns even
if there were no correlation
¯ Correlation does not necessarily
correspond to causation
¤ If probability of death penalty
correlates positively with murder rate
¤ Might be because a death penalty causes
people to commit murder
¤ Might be because high murder rates create
political pressure for a death penalty
¤ Might be because certain cultural
characteristics cause both death penalty and high murder rate
¯ Correlation does make possible
(imperfect) prediction
¤ If A correlates positively with B, then
¤ If you observe an unusually high value of
A, you can predict an unusually high value of B
¤ And be right more often than you would be
by chance. But É
¯ If you get the causation wrong, you might
be missing other relevant evidence
¤ Height correlates negatively with number
of children an adult has borne
¤ But not for women
v Linear regression
¯ instead of measuring how close to a line
the points come (correlation coefficient)
¯ you try to estimate the line they come
closest to
¯ which requires some definition of
"close."
¤ You want to count both being too high and
too low as errors
¤ So the difference between point and line
wouldn't work
¤ Instead use the square of the
difference—positive each way
¤ Find the line that minimizes the summed
square deviation.
¯ Unlike the correlation coefficient, this
one measures the size of the effect
¯ y = A + Bx
¤ A is the intercept—where the line
crosses the vertical axis
¤ B is the slope—how much the line
goes up for each unit it goes out
v Goodness of fit
¯ By convention, X (horizontal) is the
independent variable, Y (vertical) the dependent: Y=A + BX
¯ Simplest "prediction" is that Y
always equals its average value
¯ How much of the departure from that does
the regression explain?
¯ TSS is the total sum of squares: the sum over all points of (Yi - Ybar)^2, the squared deviations of Y from its average
¯ RSS is the sum of squared residuals from the regression line: the sum over all points of (Yi - A - BXi)^2
¯ R2 = (TSS - RSS)/TSS = 1 - RSS/TSS
¯ So R2 is a measure of how much of the variance about the mean is explained by the regression line: total variation, minus variation unexplained by the regression, divided by total variation
¯ So R2 of 0 means the
regression line does no better than just assigning the mean value to every
point
¯ R2 of 1 means the regression
explains all of the variance.
¯ Like correlation, this is a measure of
goodness of fit
¤ In fact, R2 is the square
¤ of the correlation coefficient r
¯ And B, the slope, is a measure of the
strength of the relationship.
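A sketch of the whole calculation on invented data: fit the least-squares line, then compute R2 from TSS and RSS as defined above.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # invented independent variable
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])        # invented dependent variable

    B, A = np.polyfit(x, y, 1)              # slope B and intercept A of the least-squares line
    predicted = A + B * x
    tss = ((y - y.mean()) ** 2).sum()       # total variation about the mean
    rss = ((y - predicted) ** 2).sum()      # variation the line leaves unexplained
    print(A, B, 1 - rss / tss)              # R2 close to 1 for these nearly linear points
    print(np.corrcoef(x, y)[0, 1] ** 2)     # and equal to the squared correlation coefficient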
v Residuals
¯ If you plot the residuals from a
regression--distance above or below the line
¯ It will show you which points don't fit
the pattern
¯ In exploratory statistics, you might want
to color points in ways reflecting other characteristics
¤ Men/women
¤ Blacks/whites
¤ Northern states/Southern states
¤ CEO's relatives/non-relatives
¤ And see if any such coloring explained
the pattern
¯ In the book's example, Mary Starchway is
both an outlier and an influential observation
¤ Outlier because her wage is much higher
than anybody else's
¤ Influential observation because she is
far off the experience/wage regression line
¤ Does the first necessarily imply the
second?
v Limitation of linear regression
¯ There might be a close relationship that
isn't linear
¯ there are procedures analogous to linear regression for dealing with the first case
¤ Instead of fitting Y = A + BX you might fit
¤ Y = A + BX + CX^2, for example
¤ Giving a curve that first falls and then rises if B<0 and C>0
¯ The second case strongly suggests that we
need more than two variables
¤ Y is determined by X, and also by
¤ Whatever it is that distinguishes the two
lines
v Multiple Regression
¯ Suppose you believe that the murder rate
depends on
¤ The death penalty
¤ The fraction of the population that is
males 18-26
¤ This year's unemployment rate
¯ You could express that as M=a+b1D+b2F+b3U
¤ Here M is the murder rate, by state
¤ D is the probability that a murderer will
get the death penalty, by state
¤ F is the fraction of the state population
that is male 18-26
¤ U is the state's unemployment rate
¯ The regression could be cross section
¤ All states
¤ In one year
¯ Or longitudinal
¤ One state
¤ In a series of years
¯ Or both
¯ And lots of more complicated versions are
possible, for instance
¤ Perhaps the murder rate depends on the
square of D, or
¤ Perhaps D should be treated as a binary
variable instead of continuous
á States with death penalty, D=1
á States without, D=0
¤ Perhaps murder rate in one year depends
on current unemployment rate but last year's death penalty probability
á In which case you use current variables
for everything else
á But a lagged variable for D
á Meaning that the value for NY in 1990 is
the death penalty probability for NY in 1989
¯ In all of these cases, you are minimizing
the sum of squared deviation from the regression's predictions
¤ Define M̂i as the value of M predicted by the regression for observation i:
¤ M̂i = a + b1Di + b2Fi + b3Ui
¤ Here i labels the particular observation (state and year in this example)
¤ We are looking for the values of a, b1, b2 and b3 that minimize
¤ The sum of squared residuals, i.e. the sum of the squared values of
¤ (Mi - M̂i)
¤ summed over all i, which is to say over all states, or years, or …
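A sketch of the estimation step with invented numbers, using numpy's least-squares solver; the column of ones supplies the intercept a:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 50
    D = rng.uniform(0.0, 0.2, n)            # execution probability (invented)
    F = rng.uniform(0.04, 0.08, n)          # fraction of population male 18-26 (invented)
    U = rng.uniform(0.03, 0.10, n)          # unemployment rate (invented)
    M = 2 - 5 * D + 80 * F + 20 * U + rng.normal(0, 0.5, n)   # murders per 100,000 (invented)

    X = np.column_stack([np.ones(n), D, F, U])
    coef, rss, rank, sv = np.linalg.lstsq(X, M, rcond=None)   # minimizes the summed squared residuals
    print(coef)                             # estimates of a, b1, b2, b3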
4/20/06
v Review.
¤ What is a multiple regression
á We have a dependent variable
¬ Murder rate
¬ Bar passage rate
¬ Cancer rate
á We think it is partly determined by other
things on which we have data, which will be our independent variables
¬ For the murder rate that might include
¯ Fraction of the population male 18-26
¯ Urban population
¯ Ratio of executions to murders
¬ For Bar passage rate, entering GPA, first year grades, …
¬ For cancer rate, smoking, age, …
á Our data show us, for each observation, the value of the dependent and independent variables
¬ For each state, murder rate, demographics, …
¬ For each graduating law student, did or didn't pass bar first time, entering GPA, …
¬ For each person in our sample, did or did not get cancer, smoker/nonsmoker, …
á We are using the data to figure out how
the independent variables affect the dependent variable
á By fitting the data to an equation of the
form:
á
yi = a + b1x1i + b2x2i + …
á
Finding the values of a, b1, b2, … that minimize the summed square error—that get the predicted values for yi as close as possible to the actual values, given the actual values of x1i, x2i, …
¤ We hope to learn two things from this
á How does each independent variable affect
the dependent variable
á How sure are we of the
effect—whether it exists and how big it is.
v Interpreting multiple regression results
¯ As usual, we distinguish between
statistical significance and size
¤ R2 tells you how much of the
variance has been explained, but É
¤ If you use a lot of variables, it isn't
surprising that you can explain a lot of variance.
¤ "Give me ten parameters and I'll fit
the skyline of New York."
¯ The regression coefficient tells you the
size of the effect, not how sure you are it is there
¤ Suppose we define D more precisely as the
probability that a convicted murderer will be executed
¤ And M as the number of murders/100,000
population
¤ If b1=10, that means that an
increase of .1 in the probability a convicted murderer will be executed
¤ Leads to one more murder per 100,000 of
population
¤ For "how sure you are there"
you use a confidence measure
á Typically a t statistic, which measures
á The size of a coefficient, say b1,
relative to its "standard error"
á Where standard error is an estimate of
the standard deviation of the coefficient calculated from a sample of a given
size
á "If we did this regression many
times, drawing our sample at random from the same population, how much would b1
vary?"
á The more standard errors the coefficient
is from zero, the more confident we are that it is not zero, i.e. that the
independent variable, say the probability of execution, affects the dependent
variable, the murder rate.
á To turn this into a confidence interval,
you use the t table, as described in the book.
¤ Consider the Bazemore et al. result
á The size of the coefficient on race was
$394.51—not very large
á Standard error was 137.67, so
t=coefficient/st error=2.87, lots of df
á Makes it to the .01 level
á If you are the defense, how do you
explain this away?
¬ How many regressions did they do?
¯ Hispanic? Gender? Quadratic instead of
linear?
¯ If they tried a hundred different
versions, not so impressive
¬ Are there hidden variables that correlate
with race and something relevant to salary? For example,
¯ where are those masters degrees from
¯ in what fields?
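The arithmetic behind the numbers just quoted, with a two-sided p-value from the t distribution; the exact degrees of freedom are not given above, so a large value is assumed:

    from scipy.stats import t

    coefficient = 394.51
    standard_error = 137.67
    t_stat = coefficient / standard_error
    print(t_stat)                           # about 2.87
    print(2 * t.sf(t_stat, df=1000))        # about 0.004, comfortably past the .01 level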
¯ More generally, what can go wrong in
multiple regressions
¤ Omitting variables: On purpose,
accidentally, or because you don't have the data
á If you omit a variable that is an
important influence on what you are explaining—wage in wage
discrimination lawsuits, for instance
¬ If it doesn't correlate with the
variables you include, the result is only to lower the R2—you aren't explaining
as much of the variation in wages as if you included it. But …
¬ If it does correlate, you end up
assigning the effect of the missing variable to the variables that correlate
with it
¯ If height affects how good you are at
basketball
¯ And correlates with weight
¯ And you do a regression including weight
but not height
¯ You conclude that heavier people are
better players.
á So any confidence result however strong
can be explained away
¬ If there is a plausible variable left out
that
¬ Has a large effect on the dependent
variable
¬ And correlates with the variable you got
the confidence result for
á If that plausible variable can be
measured, do so: include it and see what happens.
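A sketch of the height/weight/basketball point with invented data: leave out a variable that matters and correlates with an included one, and the included one absorbs its effect.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 500
    height = rng.normal(70, 3, n)                     # inches (invented)
    weight = 2.5 * height + rng.normal(0, 10, n)      # correlates with height
    skill = height + rng.normal(0, 5, n)              # only height actually matters here

    both = np.column_stack([np.ones(n), weight, height])
    weight_only = np.column_stack([np.ones(n), weight])
    print(np.linalg.lstsq(both, skill, rcond=None)[0])         # weight coefficient near zero
    print(np.linalg.lstsq(weight_only, skill, rcond=None)[0])  # now weight appears to matter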
¤ Specification search
á You do a lot of different regressions,
looking for the important variables
á And the right form for them to appear in
¬ If effect gets stronger and stronger with
increasing size, perhaps quadratic
¬ If weaker and weaker, perhaps logarithmic
á If you roll the dice enough times, you
will get a double six.
¤ Multicollinearity
á Suppose two of your included variables
correlate closely with each other
á Neither appears significant—because
most of its effect could be due to the other
¤ Causation in both directions
á Their police example
á Their expenditure on education vs test
score example
á A better one would be money spent on
medical care vs health
¤ Correlated errors
á Suppose we are studying the death
penalty, using cross sectional data—i.e. data by state.
á Dependent variable is the murder rate.
á One independent variable is the ratio of
executions to murders
á They are not really related, as it
happens, but É
á Different states differ in how they
measure the murder rate
¬ Some are very willing to interpret a
death as a murder
¬ Others are very skeptical
¯ If it might be suicide, it is suicide
¯ If it might be accident, it is accident
á What will the regression show?
¤ There are lots of fancier statistical
procedures designed to deal with such problems
v Our law school problem
¯ How would you set up a multiple
regression to figure out how to improve the bar passage rate?
¯ What variables would you include?
¯ In what form would they appear?
¯ What problems do you see with omitted
variables?
¯ With multicollinearity?
¯ With causation in the reverse direction?
Questions
from the book's supplementary material
Here
is the least squares regression
equation relating Ames graduates' yearly income in dollars 10 years past
graduation to Ames GPA. (Scale F=0
to A=4).
Income
= 1,208,000 + 25 GPA
Does
this equation offer a good reason for greedy students to work hard to improve
their GPA?
The Boston Post reports, "Researchers at Ames Law School have discovered that law school GPA has a correlation of minus .35 with the number of alcoholic drinks per month that a student imbibes. The authors of the study, Professors Stern and Meaney, speculate that alcohol has a negative effect on higher brain functions of the type required to do well in law school examinations."
Comment insightfully based on your textbook
readings about correlation.
The
following question is in a survey – what is the last digit of your
telephone number?
What
would the histogram of responses look like?
Suppose
instead it was "the last digit of the month of your birth?"
v Course Evaluations
Review of the Semester
v Decision Theory
¯ A way of formalizing how you make decisions
¯ In order to help you do it
¯ Via a decision diagram
¤ You make a choice
¤ Something happens
¤ You make another choice
¤ …
¤ some choices have costs
¤ at the end you get an outcome of some value to you
¤ what sequence of choices, on average, gives you the best result?
á Best is most simply defined as maximum expected value, but ..
á In some cases that's wrong—risk aversion
á You might much prefer a certainty of $100,000 gain to a .1 probability of $1,000,000
¯ The formal approach requires you to know all of
¤ the alternatives, probabilities, costs and payoffs
¤ So trying to set up the diagram forces you to think about what they are
¤ And try to estimate, using what you know, what your client knows, whatever else you can find out.
v Game Theory
¯ Strategic behavior
¤ In decision theory, there was only one actor
¤ Now there are at least two, with their own objectives
¤ And each of them is watching the others, and conditioning his choices on theirs.
¯ We would like to know
¤ Given a game structure, how should you play, and É
¤ What will happen, given how everyone plays
¤ One perspective is of a player, the other that of someone analyzing the result games will produce—perhaps because he is creating games
á For instance, someone making laws
á Or writing a contract
á Or structuring a business
¯ One approach is subgame perfect equilibrium
¤ Which is essentially a two person version of a decision theory diagram
¤ Without commitment strategies
¤ Meaning that when Anne gets to her final choice, she always makes the decision that is best for her
á And Bill knows she will, so can make his previous choice taking Anne's final action for granted
á And doing what is best for him, given that
á And Anne knows that, so in her choice before Bill's final choice …
¤ With commitment strategies, it might make sense for Anne to "tie her hands"
á Set things up so if Bill makes the choice she doesn't want him to make
á She will be committed to a choice he doesn't want her to make
á Even though it's bad for her too
á Because that way Bill won't make the choice she doesn't want him to.
¯ For an example where commitment strategies are important, consider bilateral monopoly
¤ AKA bargaining
á We both benefit if the transaction goes through, but É
á The terms of the transaction determine how much of the benefit each of us gets.
á Union/Management bargaining, diplomacy, parent/child, buying a house …
¤ You can try to get better terms either by
á Somehow committing yourself not to accept otherwise, or É
á By misleading the other party about what terms it is in your interest to accept
¤ The risk of either is bargaining breakdown—nobody gets anything
¯ Another is to think of a game in terms of a strategy matrix
¤ Each of my strategies is a full description of what I will do
á Start by advancing my queen's pawn
á If you respond by … I will next do …
á All the way to the end of the game, for all possibilities
á Where, in order to make your responses less predictable, your strategy might include flipping a coin at some point and deciding accordingly.
¤ The matrix shows all my strategies, all yours, and the outcome for each of us given any pair of strategies—what I'm doing and what you are doing
¤ Von Neumann showed that in a two person fixed sum game described that way, there was always a "solution"
¤ Meaning a pair of strategies, each of which was best against the other.
¤ A simpler sort of solution, but one that may not exist, is a pair of dominant strategies
á Meaning that each is the best strategy whatever the other person does
á In Prisoner's Dilemma, confessing is the dominant strategy for each player
¯ One approach to a many player game is a Nash Equilibrium
¤ Meaning a set of strategies, one for each player, such that
¤ Each player's strategy is best for him, given what the others are doing
¤ Everyone driving on the right is a Nash equilibrium
¤ So is everyone driving on the left
¤ So is a group of prisoners, faced by a guard with one bullet, none of them rushing him.
¤ As these examples suggest, a Nash equilibrium need not be
á Either unique, or É
á Optimal for the players
¯ Moral Hazard: Doesn't belong in this chapter but that's where they put it
¤ If I bear only part of the cost of my action—because my factory is insured
á I have an inadequate incentive to take the action
á And won't take it if its cost is too close to the benefit, since then
á Its cost is more than my share of the benefit
¤ Solutions include
á Requiring certain actions (sprinklers in the factory)
á Only insuring partially, so that at least I take the really valuable precautions—the ones worth considerably more than they cost.
¯ Adverse Selection: The Market for Lemons
¤ If the seller knows the quality of what is being sold and the buyer doesn't
á Buyer offers a price based on his estimate of average quality
á At that price selling is much more attractive if your goods are of low quality
á So mostly low quality goods get sold, high don't
á And buyers adjust their offer down accordingly
á So all the lemons are sold, for lemon prices, and almost none of the creampuffs—which would also get lemon prices
¤ Life or health insurance raises a similar problem
á If customers know much more about the risk than the insurance company
á High risk customers find insurance a much better deal than low risk
á So the insurance company correctly concludes that if you buy you are probably high risk
á And prices accordingly.
¤ Solutions include
á Seller provides a guarantee—but that raises moral hazard problems
á Keep both parties ignorant—forbid genetic testing before buying insurance
á Make both parties well informed—let insurance company require genetic testing.
v Contracting
¯ Basic idea—design the contract to maximize total benefit, bargain over dividing it
¯ Maximize total benefit by getting the incentives right
¤ Minimizing costs due to things like moral hazard aka externalities within the contract
á Try to set it up so that each party bears the costs that depend on his actions
á Which might include precautions, deciding whether to breach, ….
¤ And to minimize adverse selection problems
¤ Which means making each party bear the costs he is best informed about
á So A knows the risk that a strike will halt his production
á And B doesn't care, because the contract requires A to compensate him if it does
¤ But the ability to do this may be constrained by one party's limited ability to observe things
á Such as the quality of materials used to build a house
á Or whose fault something going wrong was
¯ And we ran through the logic of that in different contexts, such as
¤ Production contracts: Building a house
á How is the contractor paid, and É
á What choices does he get to make, what are restricted by contracts
¤ Service contracts
¤ Principal/Agent relations
¤ Joint Undertakings
¤ Sale or lease of property
¤ Loan
¯ Issues common to many of these are
¤ Incentives and observability
¤ Damages for breach
¤ Resolving conflicts, renegotiating
v Accounting
¯ A way of keeping track of what is happening in a firm or other organization
¤ Has to be rigid enough so that interested parties can't easily make things look good when they are not
¤ And yet flexible enough to generate useful information
¯ Rigidity requires simplifications, such as
¤ Cost to measure value most of the time
¤ Intangible assets such as goodwill ignored, unless purchased for a known price
¤ Almost all probabilities treated as one or zero
¯ Balance sheet: Photo of firm at an instant
¯ Income statement
¯ Cash flow statement
¯ T-Accounts
¤ Which show individual transactions, each in two places
á If you buy something, decreases cash, increases assets
á Sell something, the other way
á If they don't balance—sell for more than book value—the difference
¬ Goes to income, and eventually to
¬ Firm equity
¤ And eventually feed into income and from there to balance sheet
¯ Matching principle: How to decide to what period an expenditure or income is allocated?
¯ Defining an entity—boundary lines between you and your business, law school and university.
¯ Using such information to figure out
¤ Is a firm really solvent
¤ How is it doing?
¤ Why? Emma Lathen: Accounting for Murder
v Finance
¯ Theory of the firm. Coase. Berle and Means. Adam Smith
¤ Relevant to legal issues, such as whether merger violates antitrust laws
¤ And how much discretion managers should have
¤ And what the limits are on the majority stockholders
¯ Debt/Equity question—how should a firm finance itself?
¯ Firm as a problem in agency theory.
¯ Time value of money: Present value calculations
¯ Efficient market theory
v Price Theory (aka microeconomics)
¯ Economics: An approach to behavior,
¤ starting with the rationality assumption
¤ potentially applicable to all behavior
¯ the coordination problem
¤ can be solved by top down hierarchy—you do that, he does that, the other guy …
¤ or by a decentralized system of private property and exchange
¤ a lot of legal issues are about how to make the second method work better.
¯ Perfect competition
¯ Demand and supply curves—their intersection gives price and quantity
¯ Monopoly
¤ What it is
¤ Why it happens
¤ How it is in the interest of a monopolist to behave
á Sell too little at too high a price
á And spend resources becoming a monopoly
¤ What, if anything, we can do about it.
¯ Externalities—one reason the decentralized solution doesn't work perfectly
¤ If you don't bear all the costs of your action, or get all the benefits, the action that
¤ best serves your interests may not best serve ours. (Moral hazard was another way of describing the same problem.)
¤ Possible solutions include
á Regulation—make me do the right thing
á Pigouvian tax—force the internalization
¤ Coase's criticism of Pigou's analysis—externalities are really jointly produced
¤ Also, if no transaction cost, bargaining eliminates them
v Economic Analysis of Law
¯ Making sense of legal rules as systems of incentives
¤ Given these rules, how will people act in their own interest
¤ Is that the outcome we want?
¯ Property: Why it exists. Why some things are property and some are not. What's included in the bundle
¯ Torts: How do we get only "efficient torts"
¤ Meaning only torts that cost more to avoid than it's worth
¤ Strict liability, negligence, ???
¤ Worry about incentives of both parties, and about what the court can or cannot know
¤ Really the externality problem again.
¯ Crime: How do you get only efficient crimes? How do you include the cost of catching and punishing criminals in your definition of "efficient crimes?"
v Fundamentals
of Statistics
¯ Descriptive
¯ Hypothesis
testing
¯ Deducing things
v Multivariate statistics