Just a quick point.
I believe this construction of hairs will only work locally, since
\(
\text{arctet}(\exp^{[r]}(1+i)) = 1+i + r
\)
is only valid for local values. This is because arctet is undefined at all periodic points of \( \exp \), and these make up a large portion of \( \mathbb{C} \) (they pop up everywhere); and additionally we can't define \( \exp^{[r]} \) for non-integer \( r \) without a significant discussion of branch cuts (of the slog, etc.). But this seems interesting.
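For a concrete instance of the first obstruction: take the fixed point \( L \approx 0.3181 + 1.3372i \) of \( \exp \), so that \( e^L = L \). Then \( \exp^{[r]}(L) = L \) for every \( r \), and the relation above would force \( \text{arctet}(L) = \text{arctet}(L) + r \); so arctet cannot be defined there, and the same argument runs along every periodic orbit.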
You are definitely correct, though, that as \( \Re(s_1) \to \infty \) we should get a flatter and flatter line, since \( h \) gets closer and closer to \( s_1 + 1 + o(e^{-\lambda s_1}) \) (if we take \( f = \beta_\lambda \)). I'd just like to add that your formula should be,
\(
\text{arc}\beta_\lambda(\exp^{[r]}(s)) = h^{[r]}(\text{arc}\beta_\lambda(s))
\)
And for large initial values \( s \) we get that \( h^{[r]}(s) = s + r + \sum_{j=0}^{r-1} o\!\left(e^{-\lambda s_j}\right) \), where \( s_j = h^{[j]}(s) \approx s + j \) is the orbit of \( s \).
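To make this concrete, here's a minimal numerical sketch (Python). I'm assuming the β-method recursion \( \beta_\lambda(s+1) = e^{\beta_\lambda(s)}/(1+e^{-\lambda s}) \), seeding it far to the left, and inverting \( \beta_\lambda \) locally with Newton's method; the names beta, arcbeta, h below are just mine for illustration. One can then watch \( h(s) - (s+1) \) die off as \( s \) grows:

```python
import cmath

LAM = 1.0    # the multiplier lambda (an assumed sample value)

def beta(s, lam=LAM, depth=100):
    """Approximate beta_lambda(s) from the recursion
    beta(s+1) = exp(beta(s)) / (1 + exp(-lam*s)),
    seeded with 0 at s - depth and run forward to s."""
    z = 0.0 + 0.0j
    for k in range(depth, 0, -1):
        t = s - k                      # t runs over s-depth, ..., s-1
        z = cmath.exp(z) / (1 + cmath.exp(-lam * t))
    return z

def arcbeta(w, x0, lam=LAM, tol=1e-12, itmax=60):
    """Local inverse of beta near the guess x0, via Newton's method with
    a numerical derivative -- deliberately only a local inverse."""
    x, dx = x0, 1e-6
    for _ in range(itmax):
        f = beta(x, lam) - w
        if abs(f) < tol:
            break
        df = (beta(x + dx, lam) - beta(x - dx, lam)) / (2 * dx)
        x -= f / df
    return x

def h(s, lam=LAM):
    """h = arcbeta o exp o beta; should look like s + 1 + o(e^{-lam*s})."""
    return arcbeta(cmath.exp(beta(s, lam)), s + 1, lam)

if __name__ == "__main__":
    # The deviation h(s) - (s+1) should decay faster than e^{-lam*s}.
    for s in [1.0, 2.0, 3.0]:          # kept small: beta grows tetrationally
        print(s, h(s), abs(h(s) - (s + 1)))
```

The inverse here is deliberately a Newton step from a good initial guess, which matches the caveat above: \( \text{arc}\beta_\lambda \) only makes sense locally.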
I think iterating \( h \) is necessary to construct the super logarithm in a nice way: if we iterate \( h \), then we are equivalently iterating \( \exp \) once we conjugate by \( \beta_\lambda \). This is something I've been fiddling with since you posted it. I think I have to use this to describe my solution better, and to show once and for all that \( \text{tet}_\beta(s) \to \infty \) as \( \Im(s) \to \infty \), i.e. that this solution is not Kneser's solution.
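For the record, that equivalence is just the telescoping of a conjugation: wherever the inverse is defined,
\(
h^{[r]} = (\text{arc}\beta_\lambda \circ \exp \circ \beta_\lambda)^{[r]} = \text{arc}\beta_\lambda \circ \exp^{[r]} \circ \beta_\lambda,
\)
because every interior \( \beta_\lambda \circ \text{arc}\beta_\lambda \) pair cancels. So an iteration theory for \( h \) translates directly into one for \( \exp \), and hence into a super logarithm, after conjugating by \( \beta_\lambda \).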
Regards, James.