
## Optimizing parameters

I came across a fun problem recently that gave me a good opportunity to exercise my approximation muscles.

Problem: Compute $\displaystyle \lim_{n \to \infty} \frac{n + \sqrt{n} + \sqrt[3]{n} + \cdots + \sqrt[n]{n}}{n}$, if it exists.

The basic approach to such sums is that the first few terms contribute to the sum because they are large and the rest of the terms contribute to the sum because there are a lot of them, so it makes sense to approximate the two parts of the sum separately. This is an important idea, for example, in certain estimates in functional analysis.

Since $\sqrt[k]{n} \ge 1$ for $2 \le k \le n$, the numerator is at least $n + (n - 1)$, so the limit, if it exists, is at least $\lim_{n \to \infty} \frac{2n-1}{n} = 2$. In fact, this is the precise value of the limit. We’ll show this by giving progressively sharper estimates of the quantity

$\displaystyle E_n = \frac{1}{n} \sum_{k=2}^{n} \left( \sqrt[k]{n} - 1 \right)$.
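As a quick numerical sanity check (a throwaway sketch of my own, not part of the argument), the ratio does creep toward $2$ and $E_n$ tends to $0$, though slowly:

```python
def E(n):
    """E_n = (1/n) * sum_{k=2}^{n} (n^(1/k) - 1)."""
    return sum(n ** (1.0 / k) - 1 for k in range(2, n + 1)) / n

def ratio(n):
    """(n + n^(1/2) + n^(1/3) + ... + n^(1/n)) / n."""
    return (n + sum(n ** (1.0 / k) for k in range(2, n + 1))) / n

r = ratio(10 ** 6)   # a bit above 2
e = E(10 ** 6)       # small, but much bigger than 1/n
```

For $n = 10^6$ the ratio is already within a few parts in a thousand of $2$.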

In the discussion that follows I’m going to ignore a lot of error terms to simplify the computations.

### First parameter

Suppose we split the sum after $s$ terms, where $s$ is a parameter to be optimized later. Then the sum of the first $s$ terms is bounded above by $\frac{s \sqrt{n}}{n}$, and the sum of the rest of the terms is bounded above by $n^{ \frac{1}{s+2} } - \frac{n-s}{n} = \exp \left( \frac{\log n}{s+2} \right) - 1 + \frac{s}{n}$. It’s clear that in order for this latter term to even stay bounded, $s$ needs to grow at least on the order of $\log n$. On the other hand, the former term goes to zero as long as $s$ grows slower than $\sqrt{n}$. And as one term gets larger, the other gets smaller, so ideally we want them to have the same growth rate for large $n$, and this should occur when $s$ has growth rate somewhere between $\log n$ and $\sqrt{n}$.

Well, if we suppose that $s$ grows fast enough so that $\exp \left( \frac{\log n}{s + 2} \right) - 1$ goes to zero, then (using $e^x - 1 \sim x$ as $x \to 0$) the latter term is asymptotic to $\frac{\log n}{s}$, so setting $\frac{s \sqrt{n}}{n} = \frac{\log n}{s}$ gives $s = n^{ \frac{1}{4} } \sqrt{\log n}$ as our optimum. (We don’t need to worry about rounding since disregarding finitely many terms of the sum doesn’t affect the limit.) This in turn gives

$\displaystyle E_n = O \left( n^{ - \frac{1}{4} } \sqrt{\log n} \right)$.
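This balancing act is easy to test empirically. The sketch below (my own illustration, nothing from the argument itself) minimizes the two-block upper bound $\frac{s\sqrt{n}}{n} + n^{\frac{1}{s+2}} - 1 + \frac{s}{n}$ over integer $s$ by brute force and compares the minimizer to the predicted $n^{\frac{1}{4}}\sqrt{\log n}$:

```python
import math

def bound(n, s):
    # Upper bound on E_n from splitting after s terms:
    # first block <= s*sqrt(n)/n, rest <= n^(1/(s+2)) - 1 + s/n.
    return s * math.sqrt(n) / n + n ** (1.0 / (s + 2)) - 1 + s / n

n = 10 ** 6
best_s = min(range(1, 2000), key=lambda s: bound(n, s))
predicted = n ** 0.25 * math.sqrt(math.log(n))  # about 118 for n = 10^6
```

The brute-force minimizer lands within a few percent of the predicted value.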

How sharp is this estimate? Well, the $k = 2$ term alone gives $E_n \ge \frac{\sqrt{n} - 1}{n} \approx n^{- \frac{1}{2} }$, so there’s certainly room for improvement.

### Second parameter

Now suppose we split the sum twice, first after $r$ and then after $r + s$ terms. Then the sum of the first $r$ terms is bounded above by $\frac{r \sqrt{n}}{n}$, the sum of the next $s$ terms is bounded above by $\frac{s n^{ \frac{1}{r+1} } }{n}$, and the sum of the rest of the terms is bounded above by $\frac{(n - r - s)(n^{ \frac{1}{r+s+1} } - 1) }{n} \approx \exp \left( \frac{\log n}{r+s} \right) - 1 \approx \frac{\log n}{r + s}$, again under the assumption that $r + s$ grows faster than $\log n$. Now we should compare the three growth rates

$\displaystyle \frac{r}{\sqrt{n}} = s n^{- \frac{r}{r+1} } = \frac{\log n}{r+s}$.

It’s somewhat less obvious what to do here. Under the assumption that $r$ is a fixed constant and that $s$ grows, equating the latter two growth rates gives

$\displaystyle s n^{ - \frac{r}{r+1} } = \frac{\log n}{s}$

hence $s = n^{ \frac{r}{2r+2} } \sqrt{\log n}$. By choosing larger and larger (but fixed) values of $r$, it follows that

$E_n = O \left( n^{-\frac{1}{2} + \epsilon} \right)$

for every $\epsilon > 0$. But this isn’t optimal, since the $\frac{r}{\sqrt{n}}$ term is smaller than it could be. If we allow $r$ to depend on $n$ then we have to be a little careful, since the effect of adding a constant to $r$ is to modify the middle growth rate considerably. With that in mind, thinking of the middle growth rate as $\frac{ sn^{ \frac{1}{r+1} } }{n}$ suggests that we try $r = c \log n - 1$ since we’ll get a constant in the numerator. This gives

$\displaystyle \frac{c \log n}{\sqrt{n}} = \frac{s e^{ \frac{1}{c} } }{n} = \frac{\log n}{s}$

hence $s = e^{- \frac{1}{2c}} \sqrt{n \log n}$, giving us a final growth rate of

$\displaystyle E_n = O(c n^{-\frac{1}{2}} \log n)$

for any choice of $c$. This suggests that we can do better. At this point the calculations get a little tedious, so I’m just going to tell you that an even better choice is

$\displaystyle r = \frac{\log n}{\log \log n - 2 \log \log \log n} - 1$

and that, after the appropriate calculations, it gives

$E_n = O \left( n^{- \frac{1}{2} } \frac{\log n}{\log \log n} \right)$.
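This bound is only asymptotic, but a numerical spot check (mine, and suggestive only, since all constants were dropped) shows $E_n$ sitting well below $n^{-\frac{1}{2}} \frac{\log n}{\log \log n}$ at moderate $n$:

```python
import math

def E(n):
    """E_n = (1/n) * sum_{k=2}^{n} (n^(1/k) - 1)."""
    return sum(n ** (1.0 / k) - 1 for k in range(2, n + 1)) / n

n = 10 ** 6
e = E(n)
bound = math.log(n) / (math.sqrt(n) * math.log(math.log(n)))  # n^(-1/2) log n / log log n
```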

### A tight bound?

Of course, if we suspect that $E_n$ is very close to $O(n^{- \frac{1}{2} })$, then we shouldn’t be using the $\sqrt{n}$ term to do our estimates at all. In other words, we should really be estimating $E_n - \frac{1}{\sqrt{n}}$, i.e. starting from the $\sqrt[3]{n}$ term. This gives, essentially,

$\displaystyle \frac{r}{n^{ \frac{2}{3} } } = s n^{- \frac{r+1}{r+2} } = \frac{\log n}{s}$.

The fact that the first term is now much smaller opens up the possibility that $r$ can grow like a power of $n$. Trying out $r = \sqrt[3]{n} - 2$ gives $\frac{s \exp \left( \frac{\log n}{\sqrt[3]{n}} \right)}{n} \approx \frac{s}{n} = \frac{\log n}{s}$, hence $s = \sqrt{n \log n}$. (For this choice of $r$ the crude bound $\frac{r}{n^{ \frac{2}{3} } }$ on the first block is too lossy, but the terms $\sqrt[k]{n}$ decay rapidly in $k$: every term with $k \ge 6$ is at most $n^{ \frac{1}{6} }$, so the first block in fact contributes only $O(n^{- \frac{2}{3} } + r n^{- \frac{5}{6} }) = O(n^{- \frac{1}{2} })$.) This gives

$\displaystyle E_n = O \left( n^{- \frac{1}{2} } \sqrt{\log n} \right)$

and it’s not hard to see that this is essentially the best we can do.
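Numerically (again just a sanity check of mine, with all constants dropped), the residual after peeling off the $k = 2$ term sits comfortably inside this final bound for moderate $n$:

```python
import math

def E(n):
    """E_n = (1/n) * sum_{k=2}^{n} (n^(1/k) - 1)."""
    return sum(n ** (1.0 / k) - 1 for k in range(2, n + 1)) / n

n = 10 ** 6
residual = E(n) - 1 / math.sqrt(n)        # roughly the k >= 3 part of E_n
final_bound = math.sqrt(math.log(n) / n)  # n^(-1/2) sqrt(log n)
```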

Given that adding additional parameters isn’t likely to help much because the behavior of the sum smooths out considerably after the first batch of terms, I’m inclined to think this bound is reasonably tight, but I’m curious if a more sophisticated method can do better. Any takers?

### 5 Responses

1. I have a simple solution for this limit. Note that
$$\frac{1}{n}\sum_{k=1}^nn^{1/k}=1+\frac{1}{n}\sum_{k=2}^{n}n^{1/k}.$$
Now we will show that
\begin{eqnarray}
\tag{1}
\lim_{n\to\infty}\frac{1}{n}\sum_{k=2}^{n}n^{1/k}=1.
\end{eqnarray}
Let $a_n=\sum_{k=2}^{n}n^{1/k}$ and $b_n=n$. By the Stolz-Cesaro theorem,
\begin{eqnarray*}
\lim_{n\to\infty}\frac{1}{n}\sum_{k=2}^{n}n^{1/k}&=&\lim_{n\to\infty}\frac{a_n}{b_n}\\
&=&\lim_{n\to\infty}\frac{a_{n+1}-a_n}{b_{n+1}-b_n}\\
&=&\lim_{n\to\infty}(a_{n+1}-a_n)\\
&=&\lim_{n\to\infty}\sum_{k=2}^{n+1}(n+1)^{1/k}-\sum_{k=2}^{n}n^{1/k}\\
&=&\lim_{n\to\infty}\left(\sum_{k=2}^{n}\left[(n+1)^{1/k}-n^{1/k}\right]+(n+1)^{1/(n+1)}\right)\\
&=&\lim_{n\to\infty}\sum_{k=2}^{n}\left[(n+1)^{1/k}-n^{1/k}\right]+1\\
&=&\lim_{n\to\infty}\sum_{k=2}^{n}\frac{1}{\sum_{i=0}^{k-1}(n+1)^{(k-1-i)/k}n^{i/k}}+1.
\end{eqnarray*}
Note that
$$\sum_{i=0}^{k-1}(n+1)^{(k-1-i)/k}n^{i/k}\ge k n^{(k-1)/k}\ge kn^{1/2}$$
and hence
\begin{eqnarray*}
0<\sum_{k=2}^{n}\frac{1}{\sum_{i=0}^{k-1}(n+1)^{(k-1-i)/k}n^{i/k}}\le\sum_{k=2}^{n}\frac{1}{kn^{1/2}}=\frac{1}{n^{1/2}}\sum_{k=2}^n\frac{1}{k}=\frac{1}{n^{1/2}}(\ln n-1+\gamma+o(1))\to 0
\end{eqnarray*}
as $n\to\infty$. So (1) is true. Thus
$$\lim_{n\to\infty}\frac{1}{n}\sum_{k=1}^nn^{1/k}=2.$$

2. Perhaps using Stolz-Cesaro at the beginning might help simplify the solution a bit. This is not to suggest you hadn’t thought along that line.

• A friend of mine tried this, and it does let you compute the limit correctly, but you have to apply Stolz-Cesaro countably many times (!), at least the way he did it, and I don’t know of a version of Stolz-Cesaro that tells you what happens to the rate of convergence.

3. Hi, why not bound this sum above and below by integrals? You get $\le \int_0^n n^{1/x} \, dx$ as an upper bound. Then plugging in $n^{1/x} = e^{\ln n / x} = \sum_{i=0}^{\infty} \frac{(\ln n / x)^i}{i!}$ and integrating each term gives a nice closed form formula.

• Have you carried out this strategy? Someone tried something very much like it on AoPS and it appears to force you to attempt an exchange of a sum and integral that isn’t valid.