Last night an old idea came back to me, an idea about a function where all the derivatives start from zero and then grow smoothly. I thought it would be impossible, but then I found some interesting stuff on Wikipedia. So, I learned to use SymPy and wasted a lot of time with it. Here’s a report of my (non-)findings.

(UPDATE: I did some numerical differentiation, which showed that h(x) does have negative derivatives. See details in this comment. A disappointment, although perhaps not a surprising one. It doesn't, however, necessarily mean the goal is impossible.)

So, if anyone knows whether such a function exists and what it looks like, please tell me.

  • e0qdk@reddthat.com · 20 days ago

    all the derivatives start from zero and then grow smoothly.

    I think you need to have a discontinuity in a derivative at some level to have a function like this where the lower derivatives grow smoothly. If you have zero at all levels and no discontinuities… nothing should ever change, right?

      • e0qdk@reddthat.com · 20 days ago

        Thanks for the links. This is outside the area of math I usually deal with, but I agree it’s interesting. I think I understand what you’re asking for now, but I’ve hit my mental limit for today trying to work it out. Good luck in your search!

    • Kogasa@programming.dev · 20 days ago

      f(x) = e^(-1/x^2) for x != 0, f(0) = 0. It's relatively easy to show this is infinitely differentiable at x = 0 and every derivative is 0.
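
      A quick SymPy sketch of that check (only a sketch: it looks at one-sided limits of the symbolic derivatives rather than the difference quotients at 0):

        import sympy as sp

        x = sp.symbols('x')
        f = sp.exp(-1/x**2)                    # the x != 0 branch; f(0) is defined separately as 0

        for n in range(1, 5):
            d = sp.diff(f, x, n)               # n-th derivative away from 0
            print(n, sp.limit(d, x, 0, '+'))   # each limit comes out as 0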

      The intuition that an infinitely differentiable function is determined globally by its derivatives at a single point actually is true for complex-differentiable functions; this property is sometimes referred to as the "rigidity" of complex-differentiable (or analytic/holomorphic) functions. It doesn't hold for functions that are only differentiable along the real axis.

  • wabasso@lemmy.ca · 20 days ago

    I won’t be able to help you, but was wondering if you could help me understand what tau is in the equations. I got lost when that showed up.

    • SurrealPartisanOP · 20 days ago

      You can think of the convolution as a process that smooths the function g by letting its values at points near each t affect the value at t. So tau is the distance between t and another point, and Psi(tau) tells how much that other point contributes to the smoothed value at t. In a more decent situation, the integral in (7) would have been solved in closed form and tau would have disappeared, never to bother us again.
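
      Concretely, the smoothed value at t is the integral over tau of g(t - tau) * Psi(tau). A minimal numerical sketch of that idea (the kernel and g below are placeholder choices, not the actual functions from the post):

        import numpy as np

        # offsets tau around each point t, and a step size for the Riemann sum
        tau = np.linspace(-5.0, 5.0, 2001)
        dtau = tau[1] - tau[0]

        # a placeholder smoothing kernel, normalized so it integrates to ~1
        psi = np.exp(-tau**2)
        psi /= psi.sum() * dtau

        def g(t):
            return np.maximum(t, 0.0)          # placeholder for the function being smoothed

        def smoothed(t):
            # (g * psi)(t) = integral of g(t - tau) * psi(tau) d tau, as a Riemann sum
            return np.sum(g(t - tau) * psi) * dtau

        print(smoothed(-1.0), smoothed(0.0), smoothed(1.0))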

  • WolfLink@sh.itjust.works · 20 days ago

    Could you approximate derivatives by finite differences?

    Could you write your own code implementing the derivatives?

    • SurrealPartisanOP · 20 days ago

      Could you approximate derivatives by finite differences?

      Yes. I will try that.

      Could you write your own code implementing the derivatives?

      No, I don’t think I could.

      • SurrealPartisanOP · 19 days ago

        I did some numerical differentiation with ten thousand points between 0 and 10. Negative values appeared in the third derivative; the attached figure zooms in on them. While those sudden spikes may very well be numerical artifacts caused by floating-point rounding errors or something similar, there is a clear negative slope around them, further confirmed in the fourth derivative. So this function is not what I hoped it would be.
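
        For reference, a minimal sketch of this kind of repeated finite differencing (h below is only a placeholder stand-in, not the actual h(x) from the post):

          import numpy as np

          x = np.linspace(0.0, 10.0, 10_000)

          def h(x):
              # placeholder stand-in; the real h(x) from the post goes here
              return np.where(x > 0, np.exp(-1.0 / np.maximum(x, 1e-12)**2), 0.0)

          d = h(x)
          for n in range(1, 5):
              d = np.gradient(d, x)     # n-th numerical derivative
              print(n, d.min())         # a negative minimum means the derivative dips below 0

          # Each np.gradient pass amplifies rounding noise, so isolated spikes in the
          # third or fourth derivative can be numerical artifacts as well as real features.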