Proving convergence of a sequence
Exercise (Convergence of a sequence 1)
Use the epsilon-definition to prove that the sequence
converges. What is its limit?
Exercise (Convergence of a sequence 2)
Use the epsilon-definition to show that the sequence
converges to
.
Proving divergence for an alternating sequence
Exercise (Divergence of an alternating sequence)
Prove that the sequence $((-1)^n)_{n \in \mathbb{N}}$ diverges.
How to get to the proof? (Divergence of an alternating sequence)
We need to show that

$$\forall a \in \mathbb{R}\ \exists \varepsilon > 0\ \forall N \in \mathbb{N}\ \exists n \geq N : |a_n - a| \geq \varepsilon.$$

Intuitively, as the sequence alternates between $1$ and $-1$, the only possible limits are $1$ and $-1$. However, we have to show the above statement for all $a \in \mathbb{R}$ in order to prove divergence.
First, let us consider the case $a \leq 0$: for all even $n$ there is $a_n = 1$, so

$$|a_n - a| = |1 - a| = 1 - a \geq 1.$$

That means, we are further away from $a$ than $1$. So if we choose $\varepsilon = 1$, then for any $N \in \mathbb{N}$ there will be an even $n$ with $n \geq N$, such that

$$|a_n - a| \geq 1 = \varepsilon.$$
Analogously, for $a > 0$ and all odd $n$ there is $|a_n - a| > 1$, since $a_n = -1$ and

$$|a_n - a| = |-1 - a| = 1 + a > 1.$$

So if we choose again $\varepsilon = 1$, then for each $N \in \mathbb{N}$ there will be an odd $n$ with $n \geq N$, such that

$$|a_n - a| > 1 = \varepsilon.$$
That is, for all $a \in \mathbb{R}$, the sequence has infinitely many elements staying away from $a$ by at least $1$, and hence the sequence diverges.
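The argument can be double-checked numerically. Below is a minimal Python sketch, assuming the sequence in question is $a_n = (-1)^n$ (as the alternation between $1$ and $-1$ suggests); with $\varepsilon = 1$, every candidate limit $a$ is escaped infinitely often:

```python
# Numerical check of the divergence argument, assuming a_n = (-1)^n.
# For every candidate limit a and every index N there is some n >= N
# with |a_n - a| >= 1, so no a can be the limit (choose epsilon = 1).

def a(n):
    return (-1) ** n

def escapes(candidate, N, horizon=10):
    """True if some n in [N, N + horizon) has |a_n - candidate| >= 1."""
    return any(abs(a(n) - candidate) >= 1 for n in range(N, N + horizon))

# Try several candidate limits, including the cluster points 1 and -1:
for candidate in [-1.0, -0.5, 0.0, 0.5, 1.0]:
    assert all(escapes(candidate, N) for N in range(1, 100))
```

Since any window of ten consecutive indices contains both even and odd $n$, the check succeeds for every starting index $N$.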
Powers of sequences
Limit theorems
Solution (Limits of sequences)
1. The trick is to split the roots
and
into factors
and
. We can then use that
and
and patch them together, using the limit theorems:
2. For fractions of polynomials, the "polynomial with highest degree" usually wins. Here we have two polynomials of equal degree 3. In that case, the sequence has a finite limit. We can determine it by factoring out the highest power and using the limit theorems.
3. It looks as if the numerator was of degree 2 and the denominator of degree 1, so one might think at first glance that the numerator "wins" and we get divergence to infinity. However, there is a minus sign in the numerator, so we need to do some simplification first. Indeed, the $n^2$-terms cancel and we get a finite limit:
4. What will happen for large $n$? At first glance, we go to
which is an indeterminate expression. But after some simplifications:
Alternative solution (squeeze theorem):
There is
. The squeeze theorem hence yields
.
5. Again a square root. For large $n$, there will be
So both numerator and denominator behave like a polynomial of degree 1. We factor out an $n$ and obtain a finite limit:
6. The long sum can be simplified using Gauss' sum formula $1 + 2 + \dots + n = \frac{n(n+1)}{2}$. Then, we get two polynomials of degree 2, where we can factor out an $n^2$:
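Gauss' sum formula and the "factor out the highest power" technique can also be checked numerically. A short Python sketch (the concrete sequence $\frac{1+2+\dots+n}{n^2}$ below is a hypothetical example for illustration, not necessarily the one from the exercise):

```python
# Verify Gauss' sum formula: 1 + 2 + ... + n = n(n+1)/2.
for n in range(1, 200):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2

# Hypothetical degree-2 over degree-2 example: (1 + 2 + ... + n) / n^2
# equals n(n+1)/(2 n^2) = (1 + 1/n)/2, which tends to 1/2.
def b(n):
    return sum(range(1, n + 1)) / n ** 2

assert abs(b(10 ** 6) - 0.5) < 1e-5
```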
Exercise (Sequences depending on a parameter)
Investigate whether
with
converges, depending on the parameter
.
Exponential sequences
Solution (Exponential sequences)
1. We make use of
:
2. As above, by
:
3. This case requires a little index shift:
4. In order to apply
, we expand both the numerator and the denominator by
, since
Alternative solution: If we already know that
, then we directly get
,
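The workhorse behind all four parts is the limit $\lim_{n \to \infty} q^n = 0$ for $|q| < 1$. A quick numerical illustration in Python (the concrete values of $q$ are arbitrary choices, not taken from the exercise):

```python
# For |q| < 1, the exponential sequence q^n is a null sequence;
# even n * q^n still tends to 0, since exponential decay wins
# against polynomial growth.
for q in [0.5, -0.9, 0.99]:
    assert abs(q ** 1000) < 1e-4
    assert abs(1000 * q ** 1000) < 1e-1
```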
Squeeze theorem
Exercise (Squeeze theorem for products of roots)
Prove that
.
Solution (Squeeze theorem for products of roots)
It is easy to see that the sequence elements are greater than 0. So we need an upper bounding sequence for
which converges to 0. That means, we have to show that
goes to 0 fast enough. More explicitly, we need that
decreases as a sufficiently large power of
. We consider
:
This implies
We can replace the
by a sufficient power of
So there is also an upper bounding sequence converging to 0 and by the squeeze theorem
.
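The squeeze principle itself is easy to test numerically. A Python sketch with a hypothetical sequence (not the one from the exercise): $0 \le \frac{\sin^2 n}{n} \le \frac{1}{n}$, and both bounding sequences tend to $0$:

```python
import math

# Hypothetical squeeze example: 0 <= sin(n)^2 / n <= 1/n.
# Both bounding sequences converge to 0, so the middle one does too.
def middle(n):
    return math.sin(n) ** 2 / n

for n in range(1, 10 ** 4):
    assert 0 <= middle(n) <= 1 / n

assert middle(10 ** 6) <= 1e-6
```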
Monotonicity criterion and recursively defined sequences
Exercise (Monotonicity criterion)
Let
be a sequence with
for all
. Show, using the monotonicity criterion, that in this case the product sequence
defined as
converges.
Solution (Monotonicity criterion)
This can be seen as a recursively defined sequence with
, but one where we know the explicit form.
Step 1:
is monotonically decreasing, i.e.
.
This is easy to see, as sequence elements tend to get smaller:
Step 2:
is bounded from below by
, i.e.
.
This can be shown by induction over
:
Induction basis:
.
.
Induction step:
Step 3:
actually converges.
This is a direct consequence of the monotonicity criterion:
is monotonically decreasing and bounded, so it converges.
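The three steps can be sanity-checked numerically. A Python sketch with the hypothetical factors $a_k = \frac{k}{k+1} \in (0,1)$ (chosen for illustration; the exercise's condition on the $a_n$ is assumed to keep each factor in $(0,1)$), whose partial products telescope to $\frac{1}{n+1}$:

```python
from math import isclose

# Hypothetical factors a_k = k/(k+1) in (0, 1); the partial products
# b_n = a_1 * ... * a_n telescope to 1/(n+1).
def b(n):
    prod = 1.0
    for k in range(1, n + 1):
        prod *= k / (k + 1)
    return prod

values = [b(n) for n in range(1, 50)]
# Step 1: monotonically decreasing.
assert all(x >= y for x, y in zip(values, values[1:]))
# Step 2: bounded from below by 0.
assert all(x > 0 for x in values)
# Closed-form check: b_n = 1/(n+1), a null sequence.
assert all(isclose(b(n), 1 / (n + 1)) for n in range(1, 50))
```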
Exercise (Convergence of a recursively defined sequence 1)
Why does the recursively defined sequence
converge? Determine its limit:
- By finding an explicit form of the sequence
- Using the monotonicity criterion
Solution (Convergence of a recursively defined sequence 1)
Part 1: Let us investigate the first sequence elements
Can we find a pattern? With a bit of training, one might see:
So we assert
for all
. Let us prove the assertion by induction over
:
Induction basis:
.
.
Induction step:
This verifies our assertion
for all
(even though it was not easy to find). We can directly compute the limit of this expression using the limit theorems:
Part 2: is solved in 4 steps:
Step 1:
is monotonically increasing, i.e.
.
The proof runs by induction over
:
Induction basis:
.
.
Induction step:
Step 2:
is bounded from above by
, i.e.
.
This can also be shown inductively:
Induction basis:
.
.
Induction step:
Step 3:
converges.
As
is increasing and bounded, the monotonicity criterion can be applied and
converges.
Step 4: computing the limit.
An index shift will not affect the limit:
. But plugging the recursion formula into this equation and using the limit theorems, we obtain the limit:
We solve for
:
And the limit is
(same as in part 1).
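The four-step pattern (monotone, bounded, convergent, fixed-point equation) can be observed numerically. Since the exercise's recursion is not reproduced here, the Python sketch below uses a hypothetical recursion $a_{n+1} = \sqrt{2 a_n}$ with $a_1 = 1$, which is increasing, bounded above by $2$, and whose limit $a$ satisfies $a = \sqrt{2a}$, i.e. $a = 2$:

```python
import math

# Hypothetical recursion (not the one from the exercise):
# a_{n+1} = sqrt(2 * a_n), starting at a_1 = 1.
seq = [1.0]
for _ in range(60):
    seq.append(math.sqrt(2 * seq[-1]))

# Step 1: monotonically increasing (up to floating-point rounding).
assert all(x <= y + 1e-12 for x, y in zip(seq, seq[1:]))
# Step 2: bounded from above by 2.
assert all(x <= 2 for x in seq)
# Step 4: the limit a solves the fixed-point equation a = sqrt(2a), so a = 2.
assert abs(seq[-1] - 2) < 1e-9
```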
Exercise (Convergence of a recursively defined sequence 2)
Let
. Why does the recursively defined sequence
converge? What is its limit?
Solution (Convergence of a recursively defined sequence 2)
There is
We apply this step
times and get
So all differences
are multiples of
We obtain
by adding up all differences (telescoping sum)
and get a geometric series:
There is
so the limit theorems yield
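The telescoping-sum argument can be illustrated in Python. The sketch uses the hypothetical choices $a_1 = 0$ and $a_{n+1} = a_n + q^n$ with $q = \tfrac{1}{2}$ (not the exercise's data), so that all differences are $a_{n+1} - a_n = q^n$ and the telescoping sum becomes a geometric series with limit $\frac{q}{1-q} = 1$:

```python
# Hypothetical recursion: a_1 = 0, a_{n+1} = a_n + q^n with q = 1/2.
# Telescoping: a_{n+1} = sum_{k=1}^{n} q^k, a finite geometric series.
q = 0.5
a = [0.0]  # a[0] plays the role of a_1
for n in range(1, 60):
    a.append(a[-1] + q ** n)

# Closed form of the geometric series: a[n] = q * (1 - q^n) / (1 - q).
for n in range(1, 60):
    assert abs(a[n] - q * (1 - q ** n) / (1 - q)) < 1e-12

# The limit is q / (1 - q) = 1.
assert abs(a[-1] - 1.0) < 1e-12
```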
Exercise (Monotonicity criterion for sequences)
Show that the sequence
defined by the recursion relation
converges to the so-called golden ratio $\Phi = \frac{1 + \sqrt{5}}{2}$.
How to get to the proof? (Monotonicity criterion for sequences)
At first, we need a proof of convergence. Then, we can compute the limit. So first, let us show that
is monotone and bounded (then, the monotonicity criterion can be applied).
Why is it monotone? Let us compute the first sequence elements:
We assert that
is monotonically increasing. It remains to prove this, e.g. by induction.
Then, we need an upper bound. The limit of
shall be
, so any number greater than
will be an upper bound. Since
we simply choose
as an upper bound for
. Of course, we need to verify that this is an upper bound by induction.
At this point, we have convergence of the sequence. Then, we need to show that
is the limit and we are done.
Solution (Monotonicity criterion for sequences)
Part 1: is done by induction over
:
Induction basis:
.
Induction step:
Part 2: is also done by induction over
:
- At first, we show that the odd elements are monotonically increasing, i.e.
for all
.
Induction basis:
.
Induction step:
- Now, monotone decrease for even elements is shown, meaning
for all
.
Induction basis:
.
Induction step:
Part 3: By means of parts 1 and 2, we have that
is monotonically decreasing,
is monotonically increasing and both are bounded. The monotonicity criterion implies convergence of both subsequences.
Now,
, since
, so there is
We solve the quadratic equation and obtain two solutions:
and
By means of part 1, there is
, and hence
. So the latter solution must be the limit:
.
Analogously,
. Both subsequences converge to the same limit, so there must be
which finishes the proof.
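The oscillating convergence described in parts 2 and 3 can be observed numerically. The exercise's recursion is not reproduced above; the Python sketch below uses $a_{n+1} = 1 + \frac{1}{a_n}$ with $a_1 = 1$, a standard recursion whose fixed-point equation $a = 1 + \frac{1}{a}$, i.e. $a^2 = a + 1$, has the golden ratio $\Phi = \frac{1+\sqrt{5}}{2}$ as its positive solution:

```python
import math

# Hypothetical recursion a_{n+1} = 1 + 1/a_n, starting at a_1 = 1.
seq = [1.0]
for _ in range(80):
    seq.append(1 + 1 / seq[-1])

odd = seq[0::2]   # a_1, a_3, a_5, ...: monotonically increasing
even = seq[1::2]  # a_2, a_4, a_6, ...: monotonically decreasing
# Monotonicity of both subsequences (up to floating-point rounding):
assert all(x <= y + 1e-12 for x, y in zip(odd, odd[1:]))
assert all(x >= y - 1e-12 for x, y in zip(even, even[1:]))

# Both subsequences converge to the golden ratio (1 + sqrt(5)) / 2.
phi = (1 + math.sqrt(5)) / 2
assert abs(seq[-1] - phi) < 1e-12
```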
Cauchy's limit theorem and the Cesàro mean
Solution (Cauchy's limit theorem)
- Since
converges to
, for any
there must be an
such that for all
there is:
But now, the sequence of means
in
will be dominated by terms
if
is much larger than
. So for a large enough
, the mean drops below
. This can be done with even smaller
, so for
, there is even:
For
there is now
and we have convergence.
- No, the converse does not hold true. A counterexample is the sequence $((-1)^n)_{n \in \mathbb{N}}$. It diverges (see the corresponding exercise above). However, for the Cesàro mean, there is
This is obviously a null sequence.
- We apply Cauchy's limit theorem using
. Since
there is also
.
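The counterexample can be verified numerically, assuming the divergent sequence meant is $a_n = (-1)^n$; its Cesàro means form a null sequence:

```python
# Cesàro means of the divergent sequence a_n = (-1)^n:
# mean_n = (a_1 + ... + a_n) / n, which is 0 for even n and -1/n for odd n.
def cesaro_mean(n):
    return sum((-1) ** k for k in range(1, n + 1)) / n

assert cesaro_mean(10) == 0.0
assert cesaro_mean(11) == -1 / 11
assert abs(cesaro_mean(10 ** 5 + 1)) < 1e-4
```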