This post lists solutions to the exercises in Section 1.4 (Convergence, Cauchy Sequences, and Completeness) of Erwin Kreyszig’s Introductory Functional Analysis with Applications. This is a work in progress, and proofs may be refined over time.
Proof (that every subsequence of a convergent sequence converges, with the same limit):
Suppose (x_n) is convergent and x_n \rightarrow x. Let (x_{n_k}) be a subsequence, so that each subsequence term x_{n_m} is the term of (x_n) with index n_m.

Since (x_n) is convergent, \forall \epsilon>0, \exists N, such that d(x_n,x)<\epsilon for n>N. Because the indices n_1<n_2<\cdots are strictly increasing, we have n_k \geq k, so for every k>N, n_k>N and hence d(x_{n_k},x)<\epsilon.

This implies that \forall \epsilon>0, \exists N, such that d(x_{n_k},x)<\epsilon for all k>N.

The above is the definition of the convergence of a sequence to a limit. Thus, (x_{n_k}) converges to x.
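The index fact n_k \geq k used above is standard for subsequences; since the indices are strictly increasing natural numbers, it follows by induction on k:

n_1 \geq 1, \quad n_{k+1} > n_k \geq k \Rightarrow n_{k+1} \geq k+1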
Alternatively, you can prove that the limit of (x_{n_k}) is x in the following manner.
Suppose x_{n_k} \rightarrow y. Then, by the Triangle Inequality, we have:

d(x,y) \leq d(x,x_{n_m}) + d(x_{n_m},y)

Both d(x,x_{n_m}) and d(x_{n_m},y) can be made as small as we please, since (x_n) converges to x and (x_{n_k}) converges to y, implying that d(x,y) is smaller than any positive value. Thus:

d(x,y) < \epsilon_1+\epsilon_2 \text{ for all } \epsilon_1,\epsilon_2>0 \Rightarrow d(x,y)=0

Thus, x=y, i.e., (x_{n_k}) has the same limit as (x_n).
\blacksquare

Proof (that a Cauchy sequence with a convergent subsequence converges, with the same limit):
Suppose (x_n) is Cauchy, and let (x_{n_k}) be a subsequence converging to a limit x. As before, the subsequence term x_{n_m} is the term of (x_n) with index n_m.

Since (x_n) is Cauchy, \forall \epsilon>0, \exists M, such that d(x_i,x_j)<\epsilon for all i,j>M. Since x_{n_k} \rightarrow x, \exists N, such that d(x_{n_i},x)<\epsilon for all i>N.

Pick N_0=\max(M,N). Then the above two statements become:

d(x_i,x_j)<\epsilon \text{ and } d(x_{n_i},x)<\epsilon \text{ for all } i,j>N_0

(Note that we use the same index i in both statements because any index greater than N_0 fulfils both conditions, so we might as well pick the same one. Since n_i \geq i, the subsequence term x_{n_i} is also a term of the original sequence with index greater than N_0, so the Cauchy condition applies to it as well.)
By the Triangle Inequality, we have:
d(x_j,x) \leq d(x_j, x_{n_i}) + d(x_{n_i},x) \\ \Rightarrow d(x_j,x) < \epsilon + \epsilon \\ \Rightarrow d(x_j,x) < 2\epsilon

for all j>N_0 (and any fixed i>N_0). Since \epsilon can be made as small as we please, this is precisely the definition of convergence: for every \epsilon>0 there is an N_0 such that d(x_j,x)<2\epsilon for all j>N_0. Hence, (x_n) is convergent with the limit x.
\blacksquare

Proof (that x_n \rightarrow x if and only if every neighbourhood of x contains all but finitely many terms of (x_n)):
Suppose that x_n \rightarrow x. Then \forall \epsilon>0, \exists N_0, such that d(x_n,x)<\epsilon for all n>N_0. This implies that the open ball B(x;\epsilon) contains all x_{n>N_0}. An arbitrary neighbourhood V of x contains some ball B(x;\epsilon), so V also contains all x_n with n>N_0, i.e., all but finitely many terms of the sequence.
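The last step uses the standard fact (per the usual definition of a neighbourhood) that every neighbourhood of x contains an open ball centred at x:

V \text{ a neighbourhood of } x \Rightarrow \exists \epsilon>0 \text{ such that } B(x;\epsilon)=\{y \in X : d(y,x)<\epsilon\} \subseteq V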
Conversely, suppose that for every neighbourhood V of x there is an integer n_0 such that x_n \in V for all n > n_0. Given \epsilon>0, take V to be the open ball B(x;\epsilon). Then x_n \in V implies that d(x_n,x)<\epsilon, and we can restate this as the following: \forall \epsilon>0, \exists n_0, such that d(x_n,x)<\epsilon for all n>n_0, which is the definition of x_n \rightarrow x.
\blacksquare

Proof (that every Cauchy sequence is bounded):
By definition, for a Cauchy sequence, we have: \forall \epsilon>0, \exists N_0, such that d(x_m,x_n)<\epsilon for all m,n>N_0.
Choose \epsilon=1, and let N_1 denote the corresponding N_0. Let a=\max_{1 \leq i,j \leq N_1+1} d(x_i,x_j). For any pair of terms x_p, x_q, we have:

d(x_p,x_q) \leq a \text{ if } p,q \leq N_1+1 \\ d(x_p,x_q) < 1 \text{ if } p,q > N_1 \\ d(x_p,x_q) \leq d(x_p,x_{N_1+1}) + d(x_{N_1+1},x_q) < a+1 \text{ if } p \leq N_1 < q

Combining these upper bounds, we get: \sup_{p,q} d(x_p,x_q) \leq a+1 < \infty, so the sequence is bounded.
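For reference, the boundedness established here is finiteness of the diameter of the set of terms (the diameter-based notion of a bounded set):

\delta(\{x_n\}) = \sup_{m,n} d(x_m,x_n) < \infty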
\blacksquare

Answer (to whether boundedness is sufficient for a sequence to be Cauchy, and whether convergence is):
Consider the discrete metric on \mathbb{R}. If we have the sequence (x_n)=0,1,0,1,\cdots, then the sequence is bounded because \sup d(x_m, x_n)=1, but for \epsilon=\frac{1}{2}, there is no N for which d(x_m,x_n)<\epsilon for all m,n>N. Thus, the sequence is bounded but not Cauchy.
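Concretely, consecutive terms are always at distance 1 in the discrete metric, so no tail of the sequence can satisfy the Cauchy condition:

d(x_m,x_{m+1})=1>\tfrac{1}{2} \text{ for every } m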
Convergence, on the other hand, is sufficient for a sequence to be Cauchy. For convergence, we have the condition: if x_n \rightarrow x, then \forall \epsilon>0, \exists N_0, such that d(x_n,x)<\epsilon for all n>N_0.
Consider m,n>N_0. Then, by the Triangle Inequality, we have:
d(x_m,x_n) \leq d(x_m,x) + d(x,x_n) < \epsilon + \epsilon = 2 \epsilon

Since \epsilon is arbitrary (start from \epsilon/2 to obtain the bound \epsilon exactly), this proves the Cauchy criterion.
Proof (that if (x_n) and (y_n) are Cauchy sequences in a metric space (X,d), then (a_n), where a_n=d(x_n,y_n), converges):
Since (x_n) and (y_n) are Cauchy, for every \epsilon>0 there is an N_0 such that, for all m,n>N_0:

d(x_m,x_n)<\epsilon \\ d(y_m,y_n)<\epsilon

Then, we have:
d(x_m,y_m) \leq d(x_m,x_n) + d(x_n,y_n) + d(y_n,y_m) \\ \Rightarrow d(x_m,y_m) - d(x_n,y_n) \leq d(x_m,x_n) + d(y_n,y_m) \\ \Rightarrow d(x_m,y_m) - d(x_n,y_n) < 2 \epsilon

Similarly, we have:
d(x_n,y_n) \leq d(x_n,x_m) + d(x_m,y_m) + d(y_m,y_n) \\ \Rightarrow d(x_n,y_n) - d(x_m,y_m) \leq d(x_n,x_m) + d(y_m,y_n) \\ \Rightarrow d(x_n,y_n) - d(x_m,y_m) < 2 \epsilon

The above inequalities imply that:

\vert d(x_m,y_m) - d(x_n,y_n) \vert < 2 \epsilon, \quad \text{i.e.,} \quad d(a_m,a_n)=\vert a_m - a_n \vert < 2 \epsilon
This implies that (a_n), where a_n=d(x_n,y_n), is Cauchy in \mathbb{R}, and thus converges, since \mathbb{R} is complete.
\blacksquare

Lemma 1.4-2(b) is: let X=(X,d) be a metric space. If x_n \rightarrow x and y_n \rightarrow y, then d(x_n,y_n) \rightarrow d(x,y).
Proof:
We have x_n \rightarrow x and y_n \rightarrow y. Then (x_n) and (y_n) are Cauchy, since every convergent sequence is Cauchy. Thus the following two statements hold true:

\forall \epsilon>0, \exists M, such that d(x_m,x_n)<\epsilon/2 for all m,n>M \\ \forall \epsilon>0, \exists N, such that d(y_m,y_n)<\epsilon/2 for all m,n>N
Taking N_0=\max(M,N), the above statements become:
\forall \epsilon>0, \exists N_0 such that d(x_m,x_n)<\epsilon/2 and d(y_m,y_n)<\epsilon/2 for m,n>N_0, i.e., d(x_m,x_n)+d(y_m,y_n)<\epsilon/2+\epsilon/2=\epsilon
We will prove the result using proof by contradiction.
Let (a_n)=(d(x_n,y_n)), and suppose the claim is not true, i.e., (a_n) does not converge. Since \mathbb{R} is complete, (a_n) is then not Cauchy. This implies that: \exists \epsilon>0 such that \forall N, there exist m,n>N with d(a_m, a_n) \geq \epsilon.
By the Triangle Inequality, we have:
d(x_m,y_m) \leq d(x_m,x_n) + d(x_n,y_n) + d(y_n,y_m) \\ \Rightarrow d(x_m,x_n) + d(y_n,y_m) \geq d(x_m,y_m) - d(x_n,y_n)

Exchanging the roles of m and n gives the same bound for d(x_n,y_n) - d(x_m,y_m), so that d(x_m,x_n) + d(y_n,y_m) \geq \vert a_m - a_n \vert. For the indices m,n>N_0 supplied by the negation above, \vert a_m - a_n \vert \geq \epsilon, and hence:

d(x_m,x_n)+d(y_m,y_n) \geq \epsilon

But this contradicts our earlier conclusion that d(x_m,x_n)+d(y_m,y_n)<\epsilon for all m,n>N_0. Thus (a_n) is Cauchy, and is thus a convergent sequence.
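The contradiction argument shows that (a_n) converges; to identify its limit as d(x,y), as the lemma asserts, the standard direct estimate can be used (added here for completeness):

\vert d(x_n,y_n) - d(x,y) \vert \leq d(x_n,x) + d(y_n,y) \rightarrow 0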
\blacksquare

Proof:
We wish to show the following: if a sequence (x_n) in X converges to a limit L_1 in the metric space (X,d_1), and the same sequence converges to a limit L_2 in (X,d_2), then L_1=L_2.
We have \forall x, y \in X, a \, d_1(x,y) \leq d_2(x,y) \leq b \, d_1(x,y), where a and b are positive constants.
Then, the Triangle Inequality gives us:
d_1(L_1,L_2) \leq d_1(L_1,x_n) + d_1(x_n,L_2)

Applying the given metric constraints:

d_1(L_1,L_2) \leq d_1(L_1,x_n) + \frac{1}{a} d_2(x_n,L_2)

We know that x_n \rightarrow L_1 in (X,d_1) and x_n \rightarrow L_2 in (X,d_2); therefore, we have the following:

\forall \epsilon>0, \exists M, such that d_1(L_1,x_n)<\epsilon for all n>M \\ \forall \epsilon>0, \exists N, such that d_2(L_2,x_n)<\epsilon for all n>N
Pick N_0=\max(M,N), so that both bounds hold for all n>N_0, and fix any n>N_0.

Then d_1(L_1,x_n)<\epsilon and d_2(L_2,x_n)<\epsilon, so that we get:
d_1(L_1,L_2) < \epsilon \left(1+\frac{1}{a} \right)

Since the above is true for all \epsilon>0, we can conclude that d_1(L_1,L_2)=0. Hence L_1=L_2.
The same procedure can also be carried out using d_2, as sketched below.
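A sketch of the d_2 version, this time using the upper bound d_2 \leq b \, d_1:

d_2(L_1,L_2) \leq d_2(L_1,x_n) + d_2(x_n,L_2) \leq b \, d_1(L_1,x_n) + d_2(x_n,L_2) < \epsilon(b+1)

for any n>N_0, so d_2(L_1,L_2)=0 as well.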
\blacksquare

The three distance metrics mentioned are (on the plane, for points p=(p_1,p_2) and q=(q_1,q_2)):

d_1(p,q)=\vert p_1-q_1 \vert + \vert p_2-q_2 \vert \\ d_2(p,q)=\sqrt{ {(p_1-q_1)}^2 + {(p_2-q_2)}^2 } \\ d_{\max}(p,q)=\max(\vert p_1-q_1 \vert, \vert p_2-q_2 \vert)

In what follows, we write x=\vert p_1-q_1 \vert and y=\vert p_2-q_2 \vert, so that x,y \geq 0.
Proof:
For d_1 and d_2, let us determine which inequalities hold between x+y and \sqrt{x^2+y^2}. First, test x+y \leq \sqrt{x^2+y^2} by squaring both sides (valid, since both are non-negative):

x+y \leq \sqrt{x^2+y^2} \\ \Rightarrow x^2+y^2+2xy \leq x^2+y^2

This gives us 2xy \leq 0, which fails for x,y>0; if, however, we introduce a \sqrt{2} on the right-hand side, we get:

x+y \leq \sqrt{2(x^2+y^2)} \\ \Rightarrow x^2+y^2+2xy \leq 2x^2+2y^2 \\ \Rightarrow x^2+y^2-2xy \geq 0

which always holds, since it is {(x-y)}^2 \geq 0. For the reverse inequality x+y \geq \sqrt{x^2+y^2}, squaring gives 2xy \geq 0, which also always holds, so we can write the combined inequalities as:
\sqrt{x^2+y^2} \leq x+y \leq \sqrt{2} \sqrt{x^2+y^2} \\ \Rightarrow d_2(p,q) \leq d_1(p,q) \leq \sqrt{2} \, d_2(p,q)

For d_1 and d_{\max}, note that x+y \geq \max(x,y) and 2 \max(x,y) \geq x+y.
Then, we get:
\max(x,y) \leq x+y \leq 2 \max(x,y) \\ \Rightarrow d_{\max}(p,q) \leq d_1(p,q) \leq 2 \, d_{\max}(p,q)

For d_2 and d_{\max}, note that x^2+y^2 \geq {\max(x,y)}^2 and 2 \, {\max(x,y)}^2 \geq x^2+y^2.
Then, we get:
\max(x,y) \leq \sqrt{x^2+y^2} \leq \sqrt{2} \max(x,y) \\ \Rightarrow d_{\max}(p,q) \leq d_2(p,q) \leq \sqrt{2} \, d_{\max}(p,q)

\blacksquare
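As a quick numerical check of all three pairs of bounds, take component differences x=3 and y=4:

d_1=7, \quad d_2=5, \quad d_{\max}=4 \\ d_2 \leq d_1 \leq \sqrt{2} \, d_2: \; 5 \leq 7 \leq 7.07\ldots \\ d_{\max} \leq d_1 \leq 2 \, d_{\max}: \; 4 \leq 7 \leq 8 \\ d_{\max} \leq d_2 \leq \sqrt{2} \, d_{\max}: \; 4 \leq 5 \leq 5.65\ldots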
Proof (that the complex plane \mathbb{C} is complete):

Assume \mathbb{R} is complete.
Let (z_n) be a Cauchy sequence in \mathbb{C}, and write its terms using two real sequences (x_n) and (y_n):

(z_n)=x_1+iy_1,x_2+iy_2,\cdots

Assume the distance metric for \mathbb{C} is d(z_1,z_2)=\sqrt{ {(x_1-x_2)}^2 + {(y_1-y_2)}^2}.

Since \vert x_m-x_n \vert \leq d(z_m,z_n) and \vert y_m-y_n \vert \leq d(z_m,z_n), the sequences (x_n) and (y_n) are Cauchy in \mathbb{R}. Because \mathbb{R} is complete, they converge: x_n \rightarrow x and y_n \rightarrow y. Thus:

\forall \epsilon>0, \exists M, such that \vert x_n-x \vert<\epsilon/\sqrt{2} for all n>M \\ \forall \epsilon>0, \exists N, such that \vert y_n-y \vert<\epsilon/\sqrt{2} for all n>N
Pick N_0=\max(M,N), so that the above holds true for all n>N_0.
Pick z_i so that i>N_0, and let z=x+iy. Then, we have:
d(z_i,z)=\sqrt{ {(x_i-x)}^2 + {(y_i-y)}^2}<\sqrt{ \frac{\epsilon^2}{2} + \frac{\epsilon^2}{2} }=\epsilon

Then, for an arbitrary \epsilon>0, there exists N_0, such that d(z_i,z)<\epsilon for all i>N_0, i.e., z_n \rightarrow z. Furthermore, z=x+iy \in \mathbb{C}. Thus, \mathbb{C} contains the limits of all its Cauchy sequences; hence it is a complete metric space.
\blacksquare