Fall 2018, problem 71

Find all functions $f:\mathbb{R}\to\mathbb{R}$ that satisfy the condition $\displaystyle\qquad f\left(f\left(x\right)+y\right)=2x+f\left(f\left(y\right)-x\right) \quad\text{for all }x,y\in\mathbb{R}$.


3 years ago

Assuming $f$ is differentiable on all of $\mathbb{R}$, let us differentiate the functional equation with respect to $x$ and with respect to $y$:

$f'(x) \cdot f'(f(x) + y) = 2 - f'(f(y)-x)$ (i)

$f'(f(x)+y) = f'(y) \cdot f'(f(y)-x)$ (ii)

Substituting (ii) into (i) then gives:

$f'(x)f'(y)f'(f(y)-x) = 2 - f'(f(y)-x) \Rightarrow f'(f(y) -x) = \frac{2}{1 + f'(x)f'(y)}$ (iii).

Now if we differentiate (iii) with respect to $x$ and $y$, we get:

$-f''(f(y)-x) = -\frac{2f''(x)f'(y)}{[1+f'(x)f'(y)]^2}$ (iv)

$f'(y) \cdot f''(f(y)-x) = -\frac{2f'(x)f''(y)}{[1+f'(x)f'(y)]^2}$ (v)

and substituting (iv) into (v) produces:

$f'(y) \cdot \frac{2f''(x)f'(y)}{[1+f'(x)f'(y)]^2} = -\frac{2f'(x)f''(y)}{[1+f'(x)f'(y)]^2} \Rightarrow \frac{f''(x)}{f'(x)} = -\frac{f''(y)}{[f'(y)]^2}$ (vi)

Now suppose $\frac{f''(x)}{f'(x)} = A$ for some constant $A \in \mathbb{R}$ with $A \neq 0$ (the case $A = 0$, i.e. $f'' \equiv 0$, is the linear case treated below). Integrating once gives $\ln|f'(x)| = Ax + B \Rightarrow f'(x) = e^{Ax+B}$, and integrating again gives $f(x) = \frac{1}{A}e^{Ax+B} + C$ (for $B,C \in \mathbb{R}$) (vii). Substituting (vii) back into the original functional equation produces a contradiction, since the difference of two exponential functions:

$f(f(x)+y) - f(f(y)-x) = 2x$

cannot equal a linear function for all $x,y \in \mathbb{R}$.
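As a quick numerical sanity check (a sketch only, using the hypothetical particular case $A=1$, $B=0$, $C=0$, i.e. $f(x) = e^x$), the difference above is nowhere near $2x$:

```python
import math

# Hypothetical instance of (vii): A = 1, B = 0, C = 0, so f(x) = exp(x)
def f(t):
    return math.exp(t)

# If f satisfied the equation, f(f(x)+y) - f(f(y)-x) would equal 2x.
x, y = 1.0, 0.0
diff = f(f(x) + y) - f(f(y) - x)   # e^e - f(0) = e^e - 1, roughly 14.15
print(diff, "vs", 2 * x)
```

Here the left side is about $e^e - 1 \approx 14.15$ while $2x = 2$, confirming that this exponential candidate fails at a single test point.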

Returning to (vi), if we instead suppose $-\frac{f''(y)}{[f'(y)]^2} = D$ (for some constant $D \in \mathbb{R}$ with $D \neq 0$), a first integration gives:

$\frac{1}{f'(y)} = Dy + E \Rightarrow f'(y) = \frac{1}{Dy+E}$

and a second integration gives $f(y) = \frac{1}{D}\ln(Dy+E) + F$ (for $E,F \in \mathbb{R}$) (viii). As with (vii), the difference of two logarithmic functions:

$f(f(x) +y) - f(f(y)-x) = 2x$

cannot equal a linear function for all $x,y \in \mathbb{R}$.
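The same kind of spot check works here (a sketch, using the hypothetical instance $D=1$, $E=2$, $F=0$, i.e. $f(y) = \ln(y+2)$; note this $f$ is only defined for $y > -2$, so it is not even a function $\mathbb{R}\to\mathbb{R}$, but we can still test points inside its domain):

```python
import math

# Hypothetical instance of (viii): D = 1, E = 2, F = 0, so f(t) = ln(t + 2)
def f(t):
    return math.log(t + 2)

x, y = 1.0, 0.0   # chosen so both nested arguments stay inside the domain
diff = f(f(x) + y) - f(f(y) - x)   # roughly 0.60, nowhere near 2x = 2.0
print(diff, "vs", 2 * x)
```

So the logarithmic candidate also fails at a single test point.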

Hence, the original functional equation is satisfied by neither nonlinear family above. This leaves the linear case, since (vi) is also satisfied when $f''(x) = f''(y) = 0$ for all $x,y \in \mathbb{R}$. So let us consider $f(x) = Gx + H$ (for $G,H \in \mathbb{R}$). Substituting this expression into the original functional equation now yields:

$f(f(x)+y) = 2x + f(f(y)-x) \Rightarrow G[(Gx+H) + y] + H = 2x + G[(Gy+H)-x] + H \Rightarrow G^2x +GH + Gy = 2x + G^2y + GH - Gx$ (ix).

If we match the coefficients in (ix), we now get:

For $x$: $G^2 = 2 - G \Rightarrow G^2 + G - 2 = 0 \Rightarrow (G+2)(G-1) = 0 \Rightarrow G = -2, 1$ (x)

For $y$: $G = G^2 \Rightarrow G^2 - G = 0 \Rightarrow G(G-1) = 0 \Rightarrow G = 0, 1$ (xi)

Both (x) and (xi) are satisfied when $G = 1$, for any $H \in \mathbb{R}$. Hence the solutions to our original functional equation are the linear functions $\boxed{f(x) = x + H}$.
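A direct check (a numerical sketch; the random ranges and seed are arbitrary) confirms that $f(x) = x + H$ satisfies the equation for any $H$:

```python
import random

# Verify f(x) = x + H against f(f(x)+y) = 2x + f(f(y)-x)
# for randomly chosen H, x, y.
random.seed(0)
for _ in range(1000):
    H = random.uniform(-10, 10)
    f = lambda t, H=H: t + H
    x = random.uniform(-10, 10)
    y = random.uniform(-10, 10)
    # both sides equal x + y + 2H algebraically
    assert abs(f(f(x) + y) - (2 * x + f(f(y) - x))) < 1e-9
print("f(x) = x + H satisfies the equation")
```

Algebraically, both sides reduce to $x + y + 2H$, so the assertion holds identically rather than only at the sampled points.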

How did you assume it is differentiable? I saw your other answers as well, where you assumed differentiability and fortunately the functions were nice. But your solution is not complete. If I assume the function is injective, I can get the answer right away.

chorgeshashank1729 3 years ago

3 years ago

I don't know what all the functions might be, but I believe f(x) = x + c works.

Yes, if f(x) is injective, it has to be f(x)=x+C.

Substituting x=0 into the equation and denoting C=f(0) gives f(C+y)=f(f(y)). So if f is injective, we must have f(y)=C+y for any y. And we can verify that this indeed satisfies the original functional equation.

But I don't know how to proceed when f is not injective or to disprove this possibility...

JL 3 years ago
3 years ago

$f(f(x)+y)=2x+f(f(y)-x)$ $(1)$.

In $(1)$, consider $y=-f(x)$. Then $f(0)=2x+f(f(-f(x))-x) \iff f(f(-f(x))-x)= f(0)-2x$. This means $f$ is surjective ($f(0)-2x$ runs over all the real numbers as $x$ does).

Then there is $x_{0}$ such that $f(x_{0})=0$, and substituting $x = x_{0}$ into $(1)$ gives $f(y)= 2x_{0}+f(f(y)- x_{0})$, which is equivalent to

$f(y)-x_{0}=x_{0} +f(f(y)-x_{0})$ $(2)$.

Let $x$ be any real number. As $f$ is surjective, there is $y$ such that $x= f(y)-x_{0}$, and $(2)$ becomes

$x=x_{0}+f(x)$. This means that

$f(x)=x-x_{0} $.
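This conclusion is easy to sanity-check numerically (a quick sketch; the values of $x_{0}$ and the test points are arbitrary): for any choice of $x_{0}$, $f(x)=x-x_{0}$ satisfies $(1)$, and $f(x_{0})=0$ as required.

```python
# Numeric check of the conclusion: f(x) = x - x0 satisfies
# f(f(x)+y) = 2x + f(f(y)-x) for any x0, with f(x0) = 0.
for x0 in (-3.0, 0.0, 2.5):
    f = lambda t, x0=x0: t - x0
    assert f(x0) == 0.0                       # x0 is a root of f
    for x, y in ((0.0, 0.0), (1.0, -2.0), (-4.5, 3.25)):
        lhs = f(f(x) + y)                     # = x + y - 2*x0
        rhs = 2 * x + f(f(y) - x)             # = x + y - 2*x0
        assert abs(lhs - rhs) < 1e-12
print("f(x) = x - x0 satisfies the equation for every tested x0")
```

Note that $x_{0} = -f(0)$ here, so this is the same family $f(x) = x + C$ found via the injectivity argument above.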