We reconsider the first setup above, where we take samples to approximate the continuous functions \(f \colon X \rightarrow {\mathbb{R}}\) and \(g \colon Y \rightarrow {\mathbb{R}}\). Now we imagine that the region \(X\), where we sample \(f\), is miles away from the region \(Y\), where we sample \(g\). Then it may be impractical to use the same sensor for both \(f\) and \(g\). It would be more practical if someone who lived near \(X\) took the samples for \(f\) and someone else who lived near \(Y\) took the samples for \(g\), each of them using their own sensor. Now suppose that any value read from the sensor used for sampling \(f\) always differed by roughly the same additive constant \(r \in {\mathbb{R}}\) from the value that the other sensor would have read in the same location. The two sensors were so far apart, however, that we could neither measure this value \(r\) nor calibrate one sensor to match the other. To work around this dilemma we consider a slight modification of question 1.

**Question.** Is there a homeomorphism \(\varphi \colon X \rightarrow Y\) and a real number \(r\) such that \[ \xymatrix@C+10pt{ X \ar[r]^{\varphi} \ar[d]_f & Y \ar[d]^{g} \\ {\mathbb{R}}\ar[r]_{(r + \_)} & {\mathbb{R}}} \] commutes? (In other words, \(r + f(p) = g(\varphi(p))\) for all \(p \in X\).)

Again this is a very narrow question and in nature the answer is
likely to be *no*. Nevertheless we may try to
quantify how *distant an affirmative answer
is*. To this end we specify \(\varepsilon \geq 0\) and
ask the following

**Question.** Is there a homeomorphism \(\varphi \colon X \rightarrow Y\) and a real number \(r\) such that for all \(p \in X\) the estimates \(-\varepsilon \leq r + f(p) - g(\varphi(p)) \leq \varepsilon\) hold?

Now we minimize over all \(\varepsilon\) that provide an affirmative answer.

**Definition.** Let \(\mu(f, g)\) be the infimum of all \(\varepsilon \geq 0\) such that the answer to the previous question is affirmative. We call this *the relative distance of \(f\) and \(g\)*.

*Remark.* Though we won’t need it, we have the following alternative description of \(\mu(f, g)\) if \(X \neq \emptyset\). Let \(\mathcal{H}\) be the set of homeomorphisms from \(X\) to \(Y\); then \[\mu(f, g) = \frac{1}{2} \inf_{\varphi \in \mathcal{H}} \left( \sup_{p \in X} (f(p) - g(\varphi(p))) - \inf_{p \in X} (f(p) - g(\varphi(p))) \right) .\]
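For a *fixed* homeomorphism \(\varphi\) the inner optimization over \(r\) can be solved in closed form: the best shift is the midrange of the differences \(g(\varphi(p)) - f(p)\), and the resulting error is half their spread, in agreement with the formula in the remark. A minimal sketch in Python (the sample values and the assumption that the two lists are already matched up by some \(\varphi\), e.g. sampled on a common grid, are illustrative; the true \(\mu(f, g)\) still infimizes over all homeomorphisms):

```python
# Best additive shift r and resulting error eps for a FIXED matching phi:
# the samples f_samples[i] = f(p_i) and g_samples[i] = g(phi(p_i)) are
# assumed to be matched up pointwise (an illustrative assumption).

def best_shift_and_error(f_samples, g_samples):
    """Return the shift r minimizing sup |r + f(p) - g(phi(p))| over the
    samples, together with that minimal sup."""
    d = [g - f for f, g in zip(f_samples, g_samples)]  # g(phi(p)) - f(p)
    r = (max(d) + min(d)) / 2    # midrange: optimal additive calibration
    eps = (max(d) - min(d)) / 2  # half the spread: the best error bound
    return r, eps

# Two "sensors" reading the same profile up to an unknown offset:
f = [0.0, 1.0, 2.0, 1.0]
g = [5.0, 6.0, 7.0, 6.0]  # = f + 5 everywhere
r, eps = best_shift_and_error(f, g)
print(r, eps)  # -> 5.0 0.0
```

Since the offsets \(d\) are constant here, the optimal calibration \(r = 5\) recovers \(g\) exactly and the error is \(0\); noisy offsets would instead yield a positive \(\varepsilon\).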

Completely analogously to the above, we have a triangle inequality.

**Lemma** (Triangle Inequality)**.** Let \(h \colon Z \rightarrow {\mathbb{R}}\) be another continuous function; then \(\mu(f, h) \leq \mu(f, g) + \mu(g, h)\).
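The argument is the same as in the absolute case: witnessing pairs compose. A sketch (under the assumption that \(g\) is defined on \(Y\)): if \((\varphi_1, r_1)\) witnesses the bound \(\varepsilon_1\) for the pair \((f, g)\) and \((\varphi_2, r_2)\) witnesses \(\varepsilon_2\) for \((g, h)\), then \((\varphi_2 \circ \varphi_1, r_1 + r_2)\) witnesses \(\varepsilon_1 + \varepsilon_2\), since for all \(p \in X\) \[ |r_1 + r_2 + f(p) - h(\varphi_2(\varphi_1(p)))| \leq |r_1 + f(p) - g(\varphi_1(p))| + |r_2 + g(\varphi_1(p)) - h(\varphi_2(\varphi_1(p)))| \leq \varepsilon_1 + \varepsilon_2 . \]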

Moreover we have the following relation to the absolute distance.

**Lemma.** We have \(\mu(f, g) \leq M(f, g)\).

**Corollary** (Stability)**.** Suppose we have \({\lVert f - f' \rVert}_{\infty} = \varepsilon\); then \(\mu(f, f') \leq \varepsilon\).

*Proof.* This follows from Lemma 3 in conjunction with the bound \(M(f, f') \leq {\lVert f - f' \rVert}_{\infty}\).

The Triangle Inequality and Stability have the following consequence.

**Corollary.** Suppose we have \({\lVert f - f' \rVert}_{\infty} \leq \varepsilon\) and \({\lVert g - g' \rVert}_{\infty} \leq \varepsilon\); then \(|\mu(f', g') - \mu(f, g)| \leq 2 \varepsilon\).
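A sketch of the argument, combining the two previous results: by the Triangle Inequality and Stability, \[ \mu(f', g') \leq \mu(f', f) + \mu(f, g) + \mu(g, g') \leq \varepsilon + \mu(f, g) + \varepsilon , \] and by symmetry also \(\mu(f, g) \leq \mu(f', g') + 2 \varepsilon\); together these give \(|\mu(f', g') - \mu(f, g)| \leq 2 \varepsilon\).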

In summary, \(\mu\) defines an extended pseudometric on the class of continuous functions, and the above corollary shows that by approximating the functions \(f\) and \(g\) we can also approximate their relative distance.