Course description
Welcome to a course that is meant to be a first step in capturing and analysing chance, in quantifying the unknown. It provides the basics of Probability Theory and can launch you towards the study of Stochastic Processes, one of the most powerful mathematical tools of our times. Stochastic processes are inexhaustibly useful within mathematics (Differential Geometry, Differential Equations, Functional Analysis, etc.), but also outside it: in computer science, engineering, economics, actuarial science, biology, chemistry, physics, gambling, everyday life, and more. Stochastic processes are a boundless source of scientific research.
This course will cover the definition and properties of probability and conditional probability, the independence of events, the analysis (expectation, variance, moment generating function) of the most important types of discrete and continuous random variables, the joint analysis (joint and conditional density, conditional expectation and variance) of a pair (X, Y) of random variables, and the independence of random variables. We will end with the two major convergence theorems of Probability Theory: the Law of Large Numbers and the Central Limit Theorem. A detailed syllabus is presented below, after the discussion of exams and grading.
At the same time, this course serves as preparation for the actuarial P exam in probability. Although we do not work through exercises that closely match the content and style of the P exam, the course provides the foundation for that preparation. You are advised to start practicing on old P exams from the very beginning of this course; that way, as soon as a new concept is introduced you will be able to apply it to P-exam-style exercises, and your preparation will be more efficient.
Exams
Two midterm exams and a final exam will be given. Each midterm exam lasts 50 minutes; the final exam lasts two hours. All exams are multiple choice.
No makeup exams will be given (except under absolutely exceptional circumstances, in which case you will need the approval of the Head of the Mathematics Department).
Cheating will not be tolerated. If caught, you will not be allowed to continue the exam, and the penalty may range from a score of zero on that exam to a failing grade in the course with a referral to the University Disciplinary Committee.
MIDTERM 1:
PRACTICE MIDTERM 1, 2013
MIDTERM 1 + SOLUTION, 2013
Date: Monday, February 11, 2013
Time: During class time (try to arrive 10 minutes before class begins!)
Topics: Chapters 2 and 3 from the Roussas book (the two definitions of probability, properties of probability and applications, the three famous problems in different frameworks, classification of random variables (recognising whether a random variable is discrete, continuous, or neither), distribution of discrete random variables (definition, pmf, distribution function, properties of the pmf and of the distribution function, how to find F given f, how to find a constant c from the expression of f, how to find probabilities of the form P(a < X < b)), distribution of continuous random variables (definition, density, distribution function, properties of the density and of the distribution function, how to find F given f, how to find f given F, how to find a constant c from the expression of f, how to find probabilities of the form P(X ∈ J), J a real interval))
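For illustration, here is a small worked example in the spirit of these topics (the particular density is chosen here purely as an illustration, not taken from any exam). Suppose a continuous random variable X has density f(x) = c x^2 for 0 <= x <= 1 and f(x) = 0 otherwise. Then
\[
\int_{0}^{1} c\,x^{2}\,dx = \frac{c}{3} = 1 \;\Longrightarrow\; c = 3,
\qquad
F(x) = \int_{0}^{x} 3t^{2}\,dt = x^{3} \quad (0 \le x \le 1),
\]
with F(x) = 0 for x < 0 and F(x) = 1 for x > 1, and therefore
\[
P\left(\tfrac{1}{4} < X < \tfrac{1}{2}\right) = F\left(\tfrac{1}{2}\right) - F\left(\tfrac{1}{4}\right) = \frac{1}{8} - \frac{1}{64} = \frac{7}{64}.
\]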
Grade: Can go up to 220 points (200 points + 20 extra credit
points)
Review session: Friday, February 8, during class time
"Cheat sheet": Allowed (2 double paged sheets on which you have
written the essential formulas and theorems you need for the
respective exam)
SAMPLE MIDTERM 1 (Fall 2009), SOLUTIONS
SAMPLE MIDTERM 1
SAMPLE MIDTERM 1 (Fall 2011), Solution
MIDTERM 2:
MIDTERM 2 + Solution, 2013
PRACTICE MIDTERM 2, 2013
Date: Monday, March 25, 2013
Time: During class time (try to arrive 10 minutes before class begins!)
Topics: Conditional probability and trees, the fact that conditional probability is itself a probability, the product rule, the law of total probability, Bayes theorem, indicator random variables, definitions and characterizations of independence, the distinction between disjointness and independence, the distinction between pairwise independence and (mutual) independence, how to see independence on a tree, definition and properties of expectation in the discrete case, definition and properties of expectation in the continuous case, the expectation of the composition of a random variable with a real function in both the discrete and continuous cases, definition and properties of variance, definition, existence conditions and properties of the moment generating function, the importance of the moment generating function, how to compute moments knowing the mgf, how to recover the distribution knowing the mgf, definition and analysis of a discrete Uniform random variable, definition and analysis of a Bernoulli random variable, definition and analysis of a Binomial random variable, writing the Binomial as a sum of independent Bernoulli random variables, definition and analysis of a Poisson random variable, how to obtain a Poisson as a limit of Binomial random variables, the alternative characterization of a Poisson random variable, exercises combining conditional probability and Poisson random variables, definition and analysis of a Geometric random variable, and the proof of the memorylessness of the Geometric random variable. Also be prepared for any kind of proofs and exercises concerning the above topics (for instance, to prove one of the characterizations of independence, to prove the memoryless property of the Geometric distribution, to prove that the moment generating function of the Binomial has a certain form, to prove that the Binomial distribution converges in a certain sense to a Poisson distribution, or to prove that conditional probability is a probability, etc.).
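As a sample of the short proofs listed above, here is the memorylessness computation for the Geometric distribution, written with the convention P(X = k) = (1-p)^{k-1} p for k = 1, 2, ... (if your class uses the other support convention, the argument is identical). For any integer n >= 0,
\[
P(X > n) = \sum_{k=n+1}^{\infty} (1-p)^{k-1} p = (1-p)^{n},
\]
so for integers m, n >= 0,
\[
P(X > m+n \mid X > m) = \frac{P(X > m+n)}{P(X > m)} = \frac{(1-p)^{m+n}}{(1-p)^{m}} = (1-p)^{n} = P(X > n).
\]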
Grade: Can go up to 220 points (200 points + 20 extra credit
points)
Review session: Friday, March 22, during class time
"Cheat sheet": Allowed (2 double paged sheets on which you have
written the essential formulas and theorems you need for the
respective exam)
SAMPLE MIDTERM 2 (Fall 2009), Solutions
SAMPLE MIDTERM 2 (Spring 2010), Solution
SAMPLE MIDTERM 2 (Fall 2011), Solution
FINAL EXAM:
Date: April 30, 2013
Time: 10:30 am
Duration: 2 hours (try to arrive 15 minutes before the scheduled time!)
Location: LILY G126
Review session: April 23, 6:00-8:00 pm
Grade: Can go up to 420 points (400 points + 20 extra credit
points)
"Cheat sheet": Allowed (4 double paged sheets on which you have
written the essential formulas and theorems you need for the final
exam)
Topics: Everything, with the emphasis on the following:
Independence of 2 events, independence of 3 events, pairwise independence of 3 events, and the relations among them.
Properties of probability; be prepared to apply them in exercises.
Trees and conditional probability, the product rule, the total probability formula, Bayes formula.
How to find F given f, and vice versa, in the continuous 1-dimensional and 2-dimensional cases.
The definition and properties of the distribution function F in the 1-dimensional case.
The definition of the density in both the continuous and discrete cases. How to find c such that f is a density.
The analysis of each type of discrete and continuous random variable. Among the discrete ones, put the emphasis on the Poisson. Among the continuous ones, put the emphasis on the Exponential and the Normal.
The connection between the Poisson and the Exponential (with proof). The memorylessness property of the Exponential. All types of exercises concerning the Exponential.
The reduction of an arbitrary Normal to the Standard Normal (with proof). All kinds of proofs and exercises that require changes of variables involving the Gaussian integral. The value of the Gaussian integral. Exercises using the reduction to the Standard Normal.
The definition and properties of the Gamma function. Computations involving changes of variables that bring an integral to the integral from the definition of the Gamma function. The relation between the Gamma function and the Gaussian integral.
Exercises with the continuous Uniform distribution. The probability that a U[a, b] random variable takes values in a subinterval of [a, b].
Joint density, marginal and conditional densities, etc.: the whole step-by-step construction leading to the conditional expectation and variance. Be able to carry out all steps of this construction, starting from the joint density, in both the discrete and continuous cases.
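As a quick reference for several of the items above, here are the standard formulas involved (stated in the usual notation, with Φ denoting the standard Normal distribution function):
\[
\int_{-\infty}^{\infty} e^{-x^{2}/2}\,dx = \sqrt{2\pi},
\qquad
\Gamma\left(\tfrac{1}{2}\right) = \int_{0}^{\infty} t^{-1/2} e^{-t}\,dt = \sqrt{\pi},
\]
\[
X \sim N(\mu, \sigma^{2}) \;\Longrightarrow\; Z = \frac{X - \mu}{\sigma} \sim N(0, 1),
\qquad
P(a < X < b) = \Phi\left(\frac{b - \mu}{\sigma}\right) - \Phi\left(\frac{a - \mu}{\sigma}\right).
\]
For the Poisson-Exponential connection: if the number of arrivals in [0, t] is Poisson with mean λt, then the waiting time T until the first arrival satisfies
\[
P(T > t) = P(\text{no arrivals in } [0, t]) = e^{-\lambda t},
\]
so T is Exponential with parameter λ.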
SAMPLE FINAL EXAM 2009, Solutions
SAMPLE FINAL EXAM 2010
SAMPLE QUIZ 1 (Fall 2009), Solution
SAMPLE QUIZ 1 (Fall 2011), Solution
SAMPLE QUIZ 2 (Fall 2009), Solution
SAMPLE QUIZ 2 (Fall 2011), Solution
Grades
All final grades will be posted on Banner so that students can view them confidentially.
Grade Distribution
200 points: the homework
200 points: Midterm 1 (can go up to 220 due to the extra credit points)
200 points: Midterm 2 (can go up to 220 due to the extra credit points)
400 points: Final exam (can go up to 420 due to the extra credit points)
TOTAL: 1000 points (can go up to 1060 points due to the extra credit points)
A: 900-1060
B: 800-899
C: 700-799
D: 550-699
F: below 550
In case of a campus emergency, any date is subject to change. New information will be announced on this webpage.
Syllabus
CHAPTER 1: INTRODUCTION (What is Probability Theory, Course
outline, Applications of Probability Theory)
CHAPTER 2: SOME FUNDAMENTAL CONCEPTS
2.1 Some fundamental concepts: sample space
2.2 Some fundamental results: set properties, the identification of sets with events
2.3 Random variables (definition and examples)
2.4 Basic concepts and results in counting (a brief review)
CHAPTER 3: THE CONCEPT OF PROBABILITY AND BASIC RESULTS
3.1 Definition of probability
3.2 Some of its basic properties and results
3.3 Distribution of a random variable
CHAPTER 4: CONDITIONAL PROBABILITY AND INDEPENDENCE
4.1 Conditional probability (definition, how to use trees in
solving problems, total probability theorem, Bayes formula)
4.2 Independent events (definition, characterisation, how to use
trees to analyse independence)
CHAPTER 5: NUMERICAL CHARACTERISTICS OF A RANDOM VARIABLE
5.1 Expectation, variance, moment-generating function
5.2 Some probability inequalities: Markov, Chebyshev
CHAPTER 6: SOME SPECIAL DISTRIBUTIONS
6.1 Special discrete distributions: Discrete Uniform, Bernoulli,
Binomial, Geometric, Poisson, Hypergeometric
6.2 Special continuous distributions: Uniform, Exponential,
Normal, Gamma
CHAPTER 7: JOINT DENSITIES AND RELATED QUANTITIES
7.1 Joint distribution functions and joint densities
7.2 Marginal and conditional densities, conditional expectation,
conditional variance
CHAPTER 10: INDEPENDENT RANDOM VARIABLES
10.1 Independent random variables and criteria of independence
10.2 The reproductive property of certain distributions
CHAPTER 12: LAW OF LARGE NUMBERS AND CENTRAL LIMIT THEOREM
12.1 Convergence in distribution, convergence in probability, and almost sure convergence
12.2 Law of large numbers and Central limit theorem