'Inverting' the Babylonian Method to Get Lower Approximations of a Square Root (making the approximation...














Update: I'm changing the question to:




What is the best way to exploit this idea?




(I tried, but everything goes in a circle; you wind up re-implementing the Babylonian Method.)





Let $S \gt 0$.



If one runs the Babylonian Method in the usual fashion, one gets a decreasing sequence that converges to $\sqrt S$. Using an inversion trick, we can instead get an increasing sequence with the same limit:



With seed $l_0 \gt 0$, define



$$\tag 1 l_{n+1} = \frac{2\, S\, l_n}{S + l_n^{\;2}}$$
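
To see that (1) really stays below the root: by the AM–GM inequality, $S + l_n^{\,2} \ge 2\, l_n \sqrt S$, so

$$l_{n+1} = \frac{2\, S\, l_n}{S + l_n^{\,2}} \le \frac{2\, S\, l_n}{2\, l_n \sqrt S} = \sqrt S ,$$

and whenever $l_n \lt \sqrt S$ we have $l_{n+1}/l_n = 2S/(S + l_n^{\,2}) \gt 1$, so from the first step on the sequence increases toward $\sqrt S$.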



I did some googling trying to find this recursion, but came up empty-handed. I would be surprised if it isn't already an algorithm known to numerical analysts.




What is the name of this technique?




We can combine (1) with the Babylonian method,



$$\tag 2 u_{n+1} = \frac{1}{2} \, \left(u_n + \frac{S}{u_n}\right) $$



to get convergence with built-in error bounds.



Here is a Python program implementing the proposed algorithm; it can be contrasted with the Wikipedia example of the Babylonian method.



#--------*---------*---------*---------*---------*---------*---------*---------*
# Desc: Calculate the square root of 125,348
#--------*---------*---------*---------*---------*---------*---------*---------*

S = 125348
u = 600  # rough over-estimate of the root

print('+', u)

for i in range(0, 5):
    if i % 2 == 0:  # process '+', an over-estimate: inversion step (1)
        l = (2 * S * u) / (S + u * u)
        print('-', l)
    else:           # process '-', an under-estimate: Babylonian step (2)
        u = .5 * (l + S / l)
        print('+', u)


* OUTPUT *



+ 600
- 309.91700800250544
+ 357.186837334586
- 354.03137921119
+ 354.0451951246895
- 354.04519485512014
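
A minimal sketch of one way to exploit the pairing: keep both bounds and stop once the bracket $u - l$ is tight enough, so the returned interval is its own error bound (the function name bracket_sqrt and the tolerance are illustrative choices, not from the Wikipedia article):

def bracket_sqrt(S, u, tol=1e-9):
    # u must start as an over-estimate of sqrt(S); sqrt(S) stays inside [l, u].
    l = (2 * S * u) / (S + u * u)        # inversion step (1): under-estimate
    while u - l > tol:
        u = 0.5 * (l + S / l)            # Babylonian step (2): over-estimate
        l = (2 * S * u) / (S + u * u)    # inversion step (1) again
    return l, u

print(bracket_sqrt(125348, 600))         # a tight bracket around sqrt(125348) ~ 354.0452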

















  • Usually it is enough to know that $l_n = S/u_n$ is a lower approximation, and that the next upper approximation is the midpoint of the lower and upper approximations. – LutzL, Jan 6 at 9:23






  • There are a lot of sequences that are known but do not have any specific name; sequences without practical significance usually won't be given one. – Martin Rosenau, Jan 6 at 9:25










  • @MartinRosenau I guess I can call it the Inverted Babylonian Method... – CopyPasteIt, Jan 6 at 11:03






  • We do know the closed form of $\ell_n$; you can probably use it to construct whatever error bound you like: $$\ell_{n} = \sqrt{S}\,\frac{1-\epsilon^{2^n}}{1 + \epsilon^{2^n}}\quad\text{ with }\quad\epsilon = \frac{\sqrt{S}-\ell_0}{\sqrt{S}+\ell_0}$$ – achille hui, Jan 8 at 2:54
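
As a quick numerical sanity check of that closed form (the seed and variable names are chosen for illustration), comparing it against recursion (1):

S = 125348.0
l0 = 100.0                                   # any seed with 0 < l0 < sqrt(S)
eps = (S ** 0.5 - l0) / (S ** 0.5 + l0)

l = l0
for n in range(1, 6):
    l = (2 * S * l) / (S + l * l)            # recursion (1)
    closed = S ** 0.5 * (1 - eps ** (2 ** n)) / (1 + eps ** (2 ** n))
    print(n, l, closed)                      # the two columns agree up to rounding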
















real-analysis numerical-methods






edited Jan 8 at 2:32







CopyPasteIt

















asked Jan 6 at 3:56









CopyPasteIt









1 Answer
If
$$l_{n+1} = \frac{2\, S\, l_n}{S + l_n^{\;2}},$$
then
$$l_{n+1} = \frac{2}{\frac{1}{l_n} + \frac{l_n}{S}},$$
so
$$\frac{1}{l_{n+1}} = \frac{\frac{1}{l_n} + \frac{l_n}{S}}{2}.$$

Letting $a_n = \frac{1}{l_n}$,
$$a_{n+1} = \frac{a_n + \frac{1}{S a_n}}{2} = \frac{a_n + \frac{1/S}{a_n}}{2},$$
which is the Babylonian (Newton) iteration for $\sqrt{1/S}$.

So, in whichever direction Newton's iteration converges, this one converges in the opposite direction.
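
A minimal numerical sketch of this correspondence (the seed value is illustrative): the reciprocals of the Babylonian iterates for $\sqrt{1/S}$ reproduce the $l_n$ sequence.

S = 125348.0
l = 600.0              # any positive seed
a = 1.0 / l            # reciprocal sequence a_n = 1 / l_n

for _ in range(6):
    l = (2 * S * l) / (S + l * l)     # recursion (1) for l_n
    a = 0.5 * (a + (1.0 / S) / a)     # Babylonian step for sqrt(1/S)
    print(l, 1.0 / a)                 # the two columns agree up to rounding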






        answered Jan 8 at 0:38









marty cohen
