• Aesthesiaphilia

    I mean you’ve just translated from a language most people don’t speak to a different language most people don’t speak

    • @Zeth0s

      A simpler language many people know (math) to one of the infinite dialects of a language most people don’t speak.

      The left representation is definitely more readable and understood by more people.

      • @beefcat

        I don’t know about that, I know a lot of successful programmers who never took calculus.

        The barrier to entry for programming is considerably lower today than it was even 15 years ago. Lots of kids, myself included back in the day, were learning basic control flow in languages like C, Python, or JavaScript long before taking advanced math courses in high school.

        • @Zeth0s

          Where I grew up, the sum at least is taught in every high school. The final exam in many high schools (mine included) must include at least some exercises on integrals, which are just infinitesimal sums.

          If someone went to high school, 90% of the time they know these symbols. Very few of them can program.

          Programming doesn’t require math, but scientific computing, algorithms, and HPC do require an understanding of linear algebra, as computers “think” in linear algebra.

          • @beefcat

            It was never required in my school district, where the minimum requirement was Algebra 2.

            But the popularity of this post kind of proves my point. There are a lot of programmers out there who readily understood the for loops on the right, but not the sigma notation on the left. Pretending their experience is invalid cuts us off from a potential avenue to help more people understand these concepts.

      • @[email protected]

        The left representation is definitely more readable

        Hard disagree. The right can be read linearly. You know, the way humans read.

        I sucked balls at precalc, but I’m pretty decent at programming. I suppose, with enough practice, one becomes “fluent” in mathematical notation, but the C-style language definitely reads more naturally. The mathematical notation is what I’d call “too much abstraction.”

        and understood by more people

        I don’t know the stats, but I have to imagine, by this point, there are more programmers than mathematicians.

        • @Zeth0s

          Sum and product are part of the high school curriculum in many countries. Where I grew up, the sum symbol is in the curriculum of all high schools, including trade schools.

          Regarding readability, this case is just the definition… The problem with for loops is that they become unreadable very quickly, so quickly that most modern languages focused on readability discourage the use of for loops for exactly that reason, replacing them with list comprehensions or map. Once you have a real-world case, the sum sign becomes incredibly more readable. That is why the meme is not how one implements a sum in a real-world program. The corresponding code in a modern, readable language is something like

          sum(x)   # x is a list or generator; sum is a Python built-in
          prod(x)  # e.g. math.prod (Python 3.8+) or numpy.prod
          

          which is just the mathematical notation.
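
          For comparison, a minimal sketch of how those one-liners line up with the meme’s explicit loops (plain Python; the list x is just an illustrative placeholder):

          import math

          x = [2, 3, 5, 7]           # any list of numbers

          # Explicit loops, i.e. the right-hand side of the meme
          total = 0
          for value in x:            # Σ: accumulate a running sum
              total += value

          product = 1
          for value in x:            # Π: accumulate a running product
              product *= value

          # The one-liners express the same thing
          assert total == sum(x)
          assert product == math.prod(x)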

  • @[email protected]

    Part of what’s going on here is that math notation is … not good. Not for understanding, readability, or explanation. Add in the prestige that surrounds being “good at math” and “being able to read that stuff”, and you get an unhealthy amount of gatekeeping.

    Whenever I’ve been able to find someone break down a set of equations into computer code, it has been a wonderfully clarifying experience. And I think it goes beyond just being better at code or something. Computer code, more often than not, is less forgiving about what exactly is going on in the system. Maths, IME, often leaves some ambiguity or makes some presumption in the style of “oh, of course you’d need to do that”, while if you’re going to write a program, it all needs to be there, explicitly.

    I recommend Bret Victor’s stuff on this: Kill Math

    • @Zeth0s

      That’s absolutely the opposite for me. Math language is extremely good at summarizing extremely complex logic in a few lines. We have huge ML projects with a looot of logic that can be summarized either in 10 lines of math or in 100 lines of English with overwhelming cognitive complexity.

      Math is the best language we have for logic.

      This meme is the proof: the left representation is more concise and clearer than the for loop, and therefore makes it easy to represent much more complex logic, while for loops quickly become unreadable (map and reduce, for instance, are more readable).

      • @[email protected]

        This meme is the proof: the left representation is more concise and clearer than the for loop

        Except you know what the left side is saying already. That’s a language and a phrase in that language you know well. What about people who don’t?

        • @Zeth0s

          And what about people who don’t understand a for loop? Which group is the strong majority? You are underestimating the complexity of programming-language-specific syntax. What are for, i++, {}, and *= to the uninitiated?

          • @[email protected]

            Code is something you can play with in a variety of environments and get an intuition for through experimentation. It’s a system you can inspect, manipulate, and explore yourself. It’s also a language which you can use yourself, creating your own abstractions incrementally if that helps.

            Ideally, better syntax would be used for learning, a Python-style rather than C-style for loop for example, in which case you’re starting to get toward natural language.

            In the end though, I’m not putting programming languages on a pedestal here. I’m saying mathematical notation ought not be thought of as the only way to express and understand mathematical ideas, especially for the purposes of learning, for which some of the points I made just above are relevant, as is the link I provided further up. Whatever efficiency it can bring, it also brings opacity, and it’s inevitably useful to be prepared to provide other expressions of the equations or systems.

          • @beefcat

            The difference is, a for loop is one of the first things any new programmer learns. Anybody with any programming experience can understand the examples on the right, as they follow the same C-like syntax used by a majority of modern programming languages. Kids used to figure this stuff out just learning to customize their MySpace pages.

            Few people learn what the symbols on the left mean before they start taking higher math courses in high school, nearly 10 years into their formal math education.

            This isn’t to say one way is better than the other; both examples are built for different use cases. But I wouldn’t be surprised if, in 2023, there are more people alive who understand the for loops than sigma notation.

            • @Zeth0s

              Few people? They are high school level where I grew up. Literally everyone with a high school diploma must understand at least the sum. In many high schools, the final math exam must include at least one integral, which is the infinitesimal sum.

              Programming, on the other hand, isn’t taught in most schools.

              • @beefcat

                Programming is taught with cheap educational microcontroller kits aimed at 12-year-olds. You can find them in the STEM section of just about any toy store. This idea that few people ever learn to code before calculus seems crazy to me; most of my peers were at least writing simple scripts by middle school. This is because programming is much more easily self-taught than other STEM subjects, and can be used to solve the more immediate everyday problems kids who grew up with computers might seek to solve.

                I’m not saying everyone learns to code before they learn higher math. I am saying that you shouldn’t be surprised that the comparisons in the OP have proven popular and insightful for a lot of people, because there are a lot of us who learned to code first.

                They are high school level where I grew up. Literally everyone with a high school diploma must understand at least the sum.

                My school district in Utah did not require any math credits beyond Algebra 2 at the time I graduated. Trig and calculus were classes I took because I wanted to. But Utah’s STEM requirements are woefully inadequate in my book.

        • Kogasa

          It would really, really suck if we had to do math with for loops instead of sigma notation. Sigma notation is extremely common. It’s also just not that hard to learn. For example, you can look at this meme and pretty much know how to read it.

    • @[email protected]

      It’s funny: with the increase in the use of numerical models, so much math has been turned into computer code. Derivatives and integrals, as well, come down to finite difference formulas and sums that serve as the basis for the notation. The point of the notation isn’t to explain; it’s just to simplify writing and reading it. I agree it can be a bit obtuse, but if you had to write out a for loop to solve a math equation every time, it would take forever lol
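
      As a rough illustration of that point, a minimal Python sketch (arbitrary step sizes, not anything from the comment above): the notation is one symbol, the loop spells out the whole sum.

      import math

      def derivative(f, x, h=1e-6):
          # Forward finite difference: (f(x + h) - f(x)) / h approximates f'(x)
          return (f(x + h) - f(x)) / h

      def integral(f, a, b, n=100_000):
          # Riemann sum: values of f at sample points times the step width
          dx = (b - a) / n
          return sum(f(a + i * dx) for i in range(n)) * dx

      print(derivative(math.sin, 0.0))         # ≈ 1.0, i.e. cos(0)
      print(integral(math.sin, 0.0, math.pi))  # ≈ 2.0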

      • @[email protected]

        Well this is where the computing perspective comes in.

        Programming culture has generally learnt over time that the ability to read code is important and that the speed/convenience of writing ought to be traded off, to some extent, for readability. Opinions will vary from programmer to programmer and paradigm/language etc. But the idea is still there, even for a system whose purpose is to run on a computer and work.

        In the case of mathematical notation, how much is maths read for the purposes of learning and understanding? Quite a lot, I’d say. So why not write it out as a for loop for a text/book/paper that is going to be read by many people, potentially many times?!

        If mathematicians etc. need a quick shorthand, I think human history has shown that shorthands are easily invented when needed and that we ought not worry about such a thing … it will come when needed.

        • @Zeth0s

          Actually, programs are much less readable than the corresponding math representation, even in an example as simple as a for loop. Code is known to add cognitive complexity quickly, while math language manages to keep complexity understandable.

          Have you tried reading how a matrix-matrix multiplication is implemented with for loops? Compare it with the mathematical representation to see what I mean.
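
          A minimal sketch of that comparison (illustrative Python, naive algorithm, just to make the point):

          # Mathematical notation: C_ij = Σ_k A_ik · B_kj  -- a single line.
          # The same thing as explicit for loops:
          def matmul(A, B):
              n, m, p = len(A), len(B), len(B[0])
              C = [[0] * p for _ in range(n)]
              for i in range(n):
                  for j in range(p):
                      for k in range(m):
                          C[i][j] += A[i][k] * B[k][j]
              return C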

          The success of Fortran, Mathematica, R, numpy, pandas, and even functional programming comes from the fact that they are built to bring programming closer to the simplicity of math.

          • @[email protected]

            Well, I think there’s a danger here of conflating abstraction with mathematical notation. Code, whether Fortran, C, or numpy, is capable of abstraction just as mathematics is. Abstraction can help bring complexity under control. But what happens when you need to understand that complexity because you haven’t learnt it yet?

            Now, sure, writing a program that will actually work and perform well adds an extra cognitive load. But I’m talking more about procedural pseudo-code being written for the purpose of explaining to those who don’t already understand.

            • @Zeth0s

              Math is the language developed exactly for that, to be an unambiguous, standard way to represent extremely complex, abstract concepts.

              In the example above, both the summation and the for loop are simply

              a_1 + a_2 + ... + a_n
              

              Math is the language to explain; programming languages are there to implement it in a way that computers can execute. In a real-world scenario it is more often

              sum(x)
              

              or

              x.sum()
              

              as a for loop is less readable (and often unoptimized).

              If someone doesn’t know math, they can do the same as those who don’t know programming: learn it.

              The learning barrier for math is actually lower than for programming.

        • Kogasa

          Using for loops instead of sigma notation would be almost universally awful for readability.

    • TroyOP

      I agree. Mathematical notation is often terribly opaque. And sometimes outright broken. Why the hell is it sin²(x)? Any reasonable programmer will tell you that this syntax will only lead to trouble. ;)

      • @Zeth0s

        What’s wrong with sin^2(x)?

        • Kogasa

          Putting an exponent on a function symbol like that usually means a typical exponential/power, except when it’s -1, in which case it’s a functional inverse: sin^(-1)(x) is the functional inverse of sin(x), which is not the same as the reciprocal (sin(x))^(-1). Some people even use sin^(a)(x) where a is an integer to denote functional composition, so sin^(2)(x) = sin(sin(x)).
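
          Spelled out, the conflicting readings described above (plain LaTeX, for reference; the last line only holds under the composition convention):

          \sin^{2}(x)  = (\sin x)^{2}     % the usual convention: a power
          \sin^{-1}(x) = \arcsin(x)       % functional inverse, not 1/\sin(x)
          \sin^{2}(x)  = \sin(\sin(x))    % composition, under some conventions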

          Besides that pretty major issue, nothing.

    • TroyOP

      Haha, touché. I just used the xpost function on the original though. I hold myself blameless.

      • @shotgun_crab

        You can edit it I think (not sure if it works for crossposts)

  • kamen

    Yeah, cool, except that the first time you encounter these (probably in high school) you’d be in the minority if you somehow already knew programming.

    Edit: and if you somehow already know programming, chances are you’ve encountered some math in the process.

    • @beefcat

      I learned basic programming skills around the time I was taking algebra in middle school. This was in the '00s.

      For me, code was a lot easier to understand and going forward I would write programs that implemented the concepts I was learning in math classes in order to better comprehend them (and make my homework easier). I demonstrated enough aptitude here that I was allowed to take two years of AP Computer Science in high school despite lacking the math prerequisites.

      I know a lot of programmers who think they are “bad at math” but really, they struggle with mathematical notation. I think a big reason for this disconnect is that mathematical notation prioritizes density, while modern programming languages and styles prioritize readability.

      These different priorities make sense, since math historically needed to be fast to write in a limited amount of space. Mathematicians use a lot of old Greek symbols, and single-letter variable identifiers. The learning curve and cognitive load associated with these features is high, but once mastered you can quickly express your complex idea on a single chalkboard.

      In programming, we don’t need to fit everything on a chalkboard. Modern IDEs make wrangling verbose identifiers trivial. The programming languages themselves make use of plain English words rather than arcane Greek letters. This results in code that, when well written, can often be somewhat understood even by lay people.
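
      A toy illustration of that trade-off (hypothetical names, purely for the sake of example):

      # Math-style density: m = (1/n) Σ x_i², typically written with single letters.
      # Code-style readability: the same idea with plain English identifiers.
      def mean_of_squares(values):
          # Average of the squared entries of `values`
          return sum(value ** 2 for value in values) / len(values)

      print(mean_of_squares([1, 2, 3]))  # (1 + 4 + 9) / 3 ≈ 4.67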

  • @someguy3

    Maybe it’s the order that you learn it in. For me the left side is the easy to read and understand one.

  • @hark

    Single-letter constant/variable names are strongly discouraged in programming but standard in math.

    • @StarManta

      Math standard practices were created at a time when everyone was doing them by hand. Absolutely no one would write out “coefficient of gravity” or whatever 20 times by hand while trying to solve a physics equation.

      Single letter variable names were common in early programming for basically the same reason, only with typing.

      Ever since the proliferation of autocomplete and IntelliSense in programming IDEs, typing a four-word-long variable name has become a few key letters and then hitting Tab. Since then, code readability has trumped the desire to type out fewer letters.

    • Kogasa

      Complicated math generally contains a lot more explicit definitions of the variables involved, either in English or with previously established notation. Writing proofs is more about communicating the result than it is proving it. In that sense it is similar to programming with an emphasis on maintainability.

      • @beefcat

        Sure, the variables have explicit definitions somewhere, but it still requires you to go back and reference them every time you forget what y stood for.

        With more verbose identifiers like in code, you don’t need these reminders. The cognitive load is reduced, because you no longer need to hold a table in your head that correlates these random letters with their definitions.

        • Kogasa

          I assure you the cognitive load would not be reduced. It would just be less readable.

  • @MossBear

    I mean Freya Holmer is a pretty great teacher, so not surprising. I learned vector math watching her videos.

  • Thalamus

    One of my math teachers explained it exactly like this. ‘For the people who know how to program: this is the same as using a for loop’.

  • @[email protected]

    Math is a language, code is instruction. The language of math is geared toward efficiency and viability for abstractions a layer higher (ad infinitum). Once you are familiar with the language, the symbols take on a life of their own, and their manipulation becomes almost mechanical, which reduces the cognitive cost for your mind to operate at the current level of abstraction, so you can focus your mental power on the next level of abstraction and try to figure out something novel. You can of course unpack the compact language of math into the more plain – in a sense more “flat” – form of code instructions; but the purpose is different: it’s more about implementing ideas than creating the ideas in the first place.

    • @AlataOrange

      It’s honestly not that useful; I’ve only ever seen it in high-level statistics.

      • mohKohn

        They come up in complex analysis because writing polynomials in terms of their roots is sometimes useful.
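
        That is, the factored form (standard notation, where r_1, …, r_n are the roots and a_n the leading coefficient):

        p(x) = a_n \prod_{i=1}^{n} (x - r_i)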

    • Kogasa

      Convergence issues aside, you can get from a product to a sum by taking logarithms. This is often a feasible way to reason about them / prove results about them.
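
      That is, the standard identity (assuming positive terms):

      \log \prod_{i=1}^{n} a_i = \sum_{i=1}^{n} \log a_i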

  • @beefcat

    I love this!

    I struggled with higher math in high school until I started learning how to code. I was lucky and had math teachers that encouraged me to learn this way.

    I would love to see a full calculus course that teaches you in code before teaching you the proper notation.

  • @Hazdaz

    One of the worst things is a teacher who knows his material so well that he can’t dumb it down enough to explain it to someone who literally has never seen that notation ever before.

    A teacher without empathy is a terrible, awful thing that can turn students off so fast.

    I went to get my degree later in life, and I would butt heads with this one particular math teacher all the time. I admit he was extremely intelligent, but the entire class was lost because he just would not break from his predetermined notes and lesson plans. Everyone else in that class was 18, 19, or 20 years old and too naive or timid to voice their concerns. I was considerably older and paying dearly for these classes, so you better believe I refused to just let issues slide. I’m sure some teachers would think I was a nightmare student, but I wasn’t trying to be disruptive - I was simply trying to learn, and this guy was just bad at it, with 3/4 of the class eventually dropping out.

  • TroyOP

    Meta: I love this thread. It gives me hope that Lemmy has the critical mass required already. I can imagine this discussion taking place in r/math, and there being many times more comments, but the substantial points are all hit here. :)

  • @nodimetotie

    tbh, I am not sure what’s more scary, the LHS or the RHS