Sequences

Tags: MATH 115

PROOF TRICKS 🔨

The two modes of limits: to compute $\lim a_n$, you can treat the sequence as-is and use certain tricks, or you can use the epsilon definition and show $|a_n - L| < \epsilon$ directly. The latter is the formal definition. It's powerful, but it can be hard to get right.

Roadmap to convergence 🔨

  1. If the sequence is bounded and monotonic, then it converges
    1. implicitly, we also know that it converges to its infimum or supremum
  1. If the sequence is monotonic in general, then it may diverge to infinity or converge
  1. If the sequence is not monotonic, then we need to use $\lim \sup, \lim \inf$ to judge whether the limit exists or not.
    1. However, we can also check the Cauchy property, because we've shown that a Cauchy sequence is the same thing as a convergent sequence
    1. this is particularly useful if you have absolutely no idea what the limit could be, like a recursive sequence where you only get a bound such as $|a_n - a_{n+1}| < \frac{1}{2^n}$

Sequence Definitions 🐧

A sequence is just a function with a discrete domain $\{n\in \mathbb{Z} : n \geq m\}$ where $m = 0$ or $m = 1$. The function is typically denoted as $s_n : \mathbb{N} \rightarrow \mathbb{R}$, and you might make the index range explicit, as in $(s_n)_{n=m}^\infty$, or just $(s_n)_{n\in \mathbb{N}}$. Or, if the domain is well-established, we just use $s_n$.

The sequence yielded by $s_n$ can have repeated values, but the corresponding set of values keeps only the unique values. Sequences are denoted by $(\,)$ and sets are denoted by $\{\,\}$.

The limit of a sequence is what $s_n$ approaches for large values of $n$.

We say that a sequence is bounded if there exists some $M$ such that $|s_n| \leq M$ for all $n$.

Implicitly, this means that $|s_n|$ never blows up: a single bound $M$ works for every $n$, not just each finite $n$ separately.

Absolute value definition 🐧

(this helps for certain proofs with limits)

We define the following from an absolute value
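Presumably the standard piecewise definition:

$$
|x| = \begin{cases} x & \text{if } x \geq 0 \\ -x & \text{if } x < 0 \end{cases}
$$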

Absolute value properties 🚀

These have three properties
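These are almost certainly the usual three:

$$
\text{(I) } |a| \geq 0 \text{ (with equality iff } a = 0\text{)}, \qquad \text{(II) } |ab| = |a|\,|b|, \qquad \text{(III) } |a + b| \leq |a| + |b|.
$$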

We actually call (III) the triangle inequality and it is very useful.

Whenever you get something like $|a - b| < L$, it means that $b$ is within a radius $L$ of $a$, so you get two inequalities from this: $a - L < b < a + L$.

Limits

Formal definition 🐧

A sequence $s_n$ converges to $s$ if for every $\epsilon > 0$ there exists some $N$ such that $n > N \implies |s_n - s| < \epsilon$.

Conversely, we say that a sequence diverges if no such $s$ exists (for a formal definition, see below). It doesn't have to diverge to infinity; it can just oscillate, like $(-1)^n$.

In English, we say that for any non-trivial radius, we can find an $N$ large enough that the sequence stays within this radius from then on. This is the fundamental definition of limits, and we use it to prove the limit properties that we use in calculus class. We also call these epsilon-$N$ proofs (the sequence version of delta-epsilon proofs).

Here’s a good diagram that can help
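For example, a bare-hands instance of the definition, showing $\lim \frac{1}{n} = 0$: let $\epsilon > 0$ and choose $N = \frac{1}{\epsilon}$. Then

$$
n > N \implies \left|\frac{1}{n} - 0\right| = \frac{1}{n} < \frac{1}{N} = \epsilon.
$$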

Limits to infinity 🐧

If a limit $\lim s_n = \infty$, we use the following strict definition: for all $M > 0$, there exists some $N$ such that $n > N$ implies that $s_n > M$.

In English, it means that for any number $M$, the tail of the sequence is eventually entirely above $M$.

For negative infinity, we just say that for all $M < 0$, there exists some $N$ such that $n > N$ implies $s_n < M$.
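For example, to show $\lim \sqrt{n} = \infty$: let $M > 0$ and choose $N = M^2$. Then

$$
n > N \implies \sqrt{n} > \sqrt{N} = M.
$$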

Uniqueness of limits 🚀

if $\lim s_n = s$ and $\lim s_n = t$, then $s = t$

Tricks about proofs 🔨

When we prove a limit goes to $s$, we let $\epsilon > 0$ and we want to end up at some definition of $N$ in terms of $\epsilon$ such that $n > N \implies |s_n - s| < \epsilon$.

To do this, we often work backwards. We start with $|s_n - s| < \epsilon$. Because $s_n$ is a function of $n$, we can often move things around in the inequality until we get something of the form $n > f(\epsilon)$; then we let $N = f(\epsilon)$ and show the forward direction.

For harder functions, you have a few tools in the toolkit
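For example, working backwards to prove $\lim \frac{n}{n+1} = 1$:

$$
\left|\frac{n}{n+1} - 1\right| = \frac{1}{n+1} < \epsilon \iff n + 1 > \frac{1}{\epsilon} \iff n > \frac{1}{\epsilon} - 1,
$$

so set $N = \frac{1}{\epsilon} - 1$ (or anything larger) and then run the chain forwards.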

Disproving limits (proving divergence) 🔨

You can either
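(The two options here are probably the standard ones: negate the definition, i.e. exhibit an $\epsilon > 0$ such that for every $N$ there is some $n > N$ with $|s_n - s| \geq \epsilon$ for any proposed limit $s$, or find two subsequences with different limits, as discussed in the subsequence section below.)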

Limit Theorems

List of proven theorems 🚀

These limit theorems are ONLY VALID if the limits are finite. So for things like $\lim n^2$, we can't use anything below. (A worked example of chaining these rules follows the list.)
  1. If $a_n \leq b_n$ for all $n > K$, then $\lim a_n \leq \lim b_n$
  1. Convergent sequences are bounded
  1. $\lim(ks_n) = k\lim s_n$.
  1. $\lim(s_n) + \lim(t_n) = \lim(s_n + t_n)$
  1. $\lim(s_nt_n) = (\lim s_n)(\lim t_n)$
    1. Corollary: $\lim((s_n)^k) = (\lim(s_n))^k$.
  1. $\lim(\frac{1}{s_n}) = \frac{1}{s}$ if $\lim s_n = s \neq 0$ and $s_n \neq 0$ for all $n$
  1. $\lim(\frac{t_n}{s_n}) = \frac{t}{s}$, where $\lim t_n = t$ and $\lim s_n = s \neq 0$
  1. If $a_n \leq b_n \leq c_n$ and $\lim a_n = \lim c_n = r$, then $\lim b_n = r$ (squeeze theorem)
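For instance, chaining these rules (after dividing the top and bottom by the dominant power of $n$):

$$
\lim \frac{3n^2 + 1}{2n^2 - n} = \lim \frac{3 + \frac{1}{n^2}}{2 - \frac{1}{n}} = \frac{\lim 3 + \lim \frac{1}{n^2}}{\lim 2 - \lim \frac{1}{n}} = \frac{3 + 0}{2 - 0} = \frac{3}{2}
$$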

Proven limit corollaries 🚀
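These are probably the standard basic limits proved from the theorems above:

$$
\lim \frac{1}{n^p} = 0 \ (p > 0), \qquad \lim a^n = 0 \ (|a| < 1), \qquad \lim n^{1/n} = 1, \qquad \lim a^{1/n} = 1 \ (a > 0).
$$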

Infinite limit theorems 🚀

  1. If $\lim s_n = \infty$ and $\lim t_n > 0$, then $\lim s_n t_n = \infty$.
  1. $\lim s_n = \infty$ if and only if $\lim(\frac{1}{s_n}) = 0$ (assuming that $s_n > 0$). The opposite is NOT true: if you only know that $\lim s_n = 0$, then you don't know what $\lim 1/s_n$ is. Take $s_n = (-1)^n/n$, for example.

Computing vs proving limits 🔨

If you want to show that $\lim s_n = s$, then you whip out your formal definitions and you might do some lower bounding, upper bounding, etc.

When they say “compute the limit using the formal definition”, you propose a limit and then prove it rigorously with the definition.

If you want to compute a limit, you need to work from the limit lemmas

Tricks for computing limits
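(The usual tricks, likely what this list contained: divide the numerator and denominator by the dominant term, squeeze the sequence between two simpler sequences whose limits you know, bound the messy parts by constants, and lean on the basic limits listed above.)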

Monotone Sequences

A sequence monotonically increases if $a_{n+1} \geq a_n$ for all $n \in \mathbb{N}$ (and monotonically decreases if $a_{n+1} \leq a_n$). We are talking about monotone sequences because they can help us show convergence even when we don't know the exact limit.

Bounded Monotone Theorem ⭐

Theorem (1): if a sequence is monotonically increasing and has an upper bound, then it converges.

Corollary: the limit of a bounded monotonically increasing sequence is $\sup(S)$, where $S = \{s_n : n \in \mathbb{N}\}$. This is baked into the proof above: the candidate limit we used to prove Theorem 1 was exactly $\sup(S)$.

Sub-corollary: if a sequence is monotonically increasing, $\lim a_n \geq a_n$ for all $n$. (comes from the definition of supremum)

Theorem (2): if a sequence is monotonically decreasing and has a lower bound, then it converges. The proof uses exactly the same approach as above, and the corollary is that the limit of a bounded monotonically decreasing sequence is $\inf(S)$.

Using this theorem 🔨

Sometimes, you can have sequences that are defined recursively and it's very nasty to compute the actual limit. However, if you can show that the sequence is monotonically increasing and has an upper bound (or vice versa), you can show that it indeed has a limit.

More specifically (for the decreasing case), you need to show that $a < s_{n+1} < s_n$, which says the sequence is monotone and bounded below by $a$. You might proceed inductively: assume this holds for $n$ and then show that $s_{n+2} < s_{n+1}$; along the way, the inductive step usually needs some fact about $s_{n+1}$ that the inductive hypothesis provides. Often, you need a lower bound if you want to show that the sequence is decreasing (not just to use the theorem, but also to establish monotonicity itself). Conversely, you often need an upper bound if you want to show that the sequence is increasing. These are just vague tips which can come in handy; there's a worked example below.

Just remember…you need to prove the base cases too!
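A standard worked example of this pattern (Newton's iteration for $\sqrt{2}$, used here purely as an illustration): let $s_1 = 2$ and $s_{n+1} = \frac{s_n}{2} + \frac{1}{s_n}$. By AM-GM and a little algebra,

$$
s_{n+1} = \frac{s_n}{2} + \frac{1}{s_n} \geq \sqrt{2}, \qquad s_{n+1} - s_n = \frac{2 - s_n^2}{2s_n} \leq 0 \ \text{ once } s_n \geq \sqrt{2},
$$

so (after checking the base case $s_1 = 2 \geq \sqrt{2}$) the sequence is decreasing and bounded below, hence it converges; the limit $L$ must satisfy $L = \frac{L}{2} + \frac{1}{L}$, which gives $L = \sqrt{2}$.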

Application: Cauchy Cantor / Nested Interval Theorem 🧸

This theorem states that there exists at least one $x$ such that

$$x \in \bigcap^\infty_{n=1}[a_n, b_n]$$

Where $[a_{n+1}, b_{n+1}] \subseteq [a_n, b_n]$ are closed, nested intervals. Graphically, it means that even if we apply tighter and tighter bounds, the intersection stays non-empty (possibly a single point).
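A quick contrast that shows why closed intervals matter here:

$$
\bigcap_{n=1}^{\infty} \left(0, \frac{1}{n}\right) = \emptyset, \qquad \bigcap_{n=1}^{\infty} \left[0, \frac{1}{n}\right] = \{0\}.
$$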

Subsequences

Definition 🐧

A subsequence is defined as $t_k$, where $k$ indexes a strictly increasing sequence of natural numbers $n_1 < \dots < n_k < n_{k+1} < \dots$ which are used to index the original sequence $s_n$. In other words,

$$t_k = s_{n_k}$$

which we can understand as selecting a bunch of the $s_n$'s, taken in order. All subsequences are infinite.

We can also understand it as a function composition. We can view the sequence as $s_n = s(n)$, and our selection as $\sigma(k)$. The subsequence is $t_k = t(k) = s(\sigma(k))$.
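For instance, if $s_n = \frac{1}{n}$ and we pick out the square indices with $\sigma(k) = k^2$, then

$$
t_k = s(\sigma(k)) = s_{k^2} = \frac{1}{k^2}.
$$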

Same-subsequence theorem 🚀

If a sequence converges, then every subsequence converges to the same limit

As an aside, if a sequence diverges, it is not the case that every subsequence diverges. Consider the example $a_n = (-1)^n$. You can select all odd $n$ and this converges.

As another aside, if a sequence has two subsequences with different limits, then the sequence diverges.

Bolzano-Weierstrass Theorem ⭐🚀

Every bounded sequence has a convergent subsequence.

To prove this, it is sufficient to show that you can extract a monotonic subsequence from any sequence; that subsequence is still bounded, so by the bounded monotone theorem, it converges.

Subsequence-convergence theorem (extra) 🚀

Theorem: if $s_n$ is a sequence, then

  1. There will be a subsequence of $s_n$ that converges to $t$ if and only if $\{ n \in \mathbb{N} : |s_n - t| < \epsilon\}$ is infinite for every $\epsilon > 0$
  1. If $s_n$ is unbounded above, it has a subsequence with limit $\infty$
  1. If $s_n$ is unbounded below, it has a subsequence with limit $-\infty$.

Subsequential limit (extra) 🐧

We define a subsequential limit as the limit of any subsequence of $s_n$ (allowing $\pm\infty$). Now, if $s_n$ converges, then we know that there is only one such limit. But if $s_n$ doesn't have a limit originally, then we are in interesting territory. We can define $\{s\}$ as the set of subsequential limits of $s_n$.

Subsequential limits and lim sup, lim inf (extra) 🚀

For any sequence, there exists a monotonic subsequence whose limit is $\lim \sup s_n$ and a monotonic subsequence whose limit is $\lim \inf s_n$.

We can unify the idea of $\{s\}$ in this theorem:

if $S = \{s\}$, i.e. the set of subsequential limits of $s_n$, then

  1. $S$ is non-empty
  1. $\sup(S) = \lim \sup s_n$ and $\inf S = \lim \inf s_n$
  1. $\lim s_n$ exists if and only if $S$ has only one element, namely $\lim s_n$.

Lim Sup and Lim Inf

Defining $\lim \sup$ 🐧

Remember that $\sup$ and $\inf$ are both not defined if the set is not bounded in the respective direction; writing $\sup = \infty$ is nonsensical.

We define the following

$$\lim \sup s_n = \lim_{N \rightarrow \infty} \sup \{s_k : k > N\}$$

$$\lim \inf s_n = \lim_{N \rightarrow \infty} \inf \{s_k : k > N\}$$

Again, unlike our discussion above, we don't restrict $s_n$ to be bounded, so this can be infinite. In our previous discussion, we had $\sup \{s_n: n \in \mathbb{N}\}$. Consider a sequence that decays over time to zero: its supremum can be much larger than $\lim \sup s_n$, which goes down to zero.
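A worked example: take $s_n = (-1)^n\left(1 + \frac{1}{n}\right)$. The even terms decrease toward $1$ and the odd terms increase toward $-1$, so

$$
\lim \sup s_n = 1, \qquad \lim \inf s_n = -1,
$$

and since these disagree, $\lim s_n$ does not exist.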

Properties of lim-sup 🚀

The $\lim \sup s_n$ is basically taking the supremum of a tail of the sequence, and the limit is drawing that tail tighter and tighter. Therefore, it must be the case that

$$\lim \sup s_n \leq \sup \{s_k : k > N\}$$

for any $N$. Similarly,

$$\lim \inf s_n \geq \inf \{s_k : k > N\}$$

this is true for any inf and sup: if the modifying function $f$ is increasing (and continuous), then $\inf f(s_n) = f(\inf(s_n))$. This is because $f$ just moves the elements of the set around without mixing up their order. Same for sup.

Lim-sup Theorem ⭐🚀

The bounded sequence $(a_n)$ converges if and only if $\lim \inf a_n = \lim \sup a_n$.

Cauchy Sequences

The definition 🐧

A sequence is a Cauchy sequence if for every $\epsilon > 0$, there exists some $N$ such that $m, n > N$ implies that $|s_n - s_m| < \epsilon$.

Graphically, past a certain index, all pairs of elements are within some $\epsilon$ distance of each other.

Cauchy equivalence to convergence ⭐🚀

Theorem: a sequence is a convergent sequence if and only if it's a Cauchy sequence

Using Cauchy for recursive definitions 🧸

Suppose that you had $|a_n - a_{n+1}| < \frac{1}{2^n}$. Well, you can try to push it toward the Cauchy definition by bounding $|a_n - a_{n+k}|$, which you do recursively: $|a_n - a_{n+k}| \leq |a_n - a_{n+k-1}| + |a_{n+k-1} - a_{n+k}|$, and the first term you bound recursively again. For this particular problem, you get that $|a_n - a_m| < \frac{1}{2^{n-1}}$ for any $m > n$.

Now, because $\frac{1}{2^{n-1}}$ decreases as $n$ gets larger, we can just set $N_\epsilon = \log_2(1/\epsilon) + 1$, and we see that for all $n, m > N_\epsilon$, the inequality $|a_n - a_m| < \epsilon$ is satisfied.
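Written out, the recursive bound is just a telescoping sum controlled by a geometric series:

$$
|a_n - a_{n+k}| \leq \sum_{j=n}^{n+k-1} |a_j - a_{j+1}| < \sum_{j=n}^{\infty} \frac{1}{2^j} = \frac{1}{2^{n-1}}.
$$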