• @[email protected]
    link
    fedilink
    10
    1 month ago

    Opinionated summary: Developers saw REST, picked the good parts, and ignored the rest (no pun intended). They still called it REST, for lack of a better word, even though things like HATEOAS were overkill for most applications.

  • Fred
    link
    fedilink
    6
    1 month ago

    Maybe I’m wildly misunderstanding something, not helped by the fact that I work very little with Web technologies, but…

    So, in a RESTful system, you should be able to enter the system through a single URL and, from that point on, all navigation and actions taken within the system should be entirely provided through self-describing hypermedia: through links and forms in HTML, for example. Beyond the entry point, in a proper RESTful system, the API client shouldn’t need any additional information about your API.

    This is the source of the incredible flexibility of RESTful systems: since all responses are self-describing and encode all the currently available actions, there is no need to worry about, for example, versioning your API! In fact, you don’t even need to document it!

    If things change, the hypermedia responses change, and that’s it.

    It’s an incredibly flexible and innovative concept for building distributed systems.
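
    To make that concrete for myself, I imagine the single entry point returning something like the sketch below, where every next step is described by links and forms in the response itself (the URLs and names are entirely made up):

        // Hypothetical entry-point response for an imaginary banking system:
        // the client starts here and discovers everything else it can do
        // from the links and forms contained in the response.
        const entryPointResponse: string = `
          <h1>Accounts</h1>
          <a href="/accounts">List accounts</a>
          <form action="/accounts" method="post">
            <input name="owner">
            <button>Open a new account</button>
          </form>
        `;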

    Does that mean only humans can interact with a REST system? But then it doesn’t really deserve the qualifier of “application programming interface”.

    • @[email protected]OP
      link
      fedilink
      English
      1
      1 month ago

      No, it doesn’t mean only humans can interact with it.

      The key point [of classical REST] is that responses are self-contained and self-describing. The response for a resource tells you what actions you can take on it. There is no need for application domain knowledge, whether shared implicitly or explicitly out of band.

      Some HTTP web APIs offer links in their JSON responses, for example previous and next references for paging/sectioning/cursors, or links to other resources. I don’t think I’ve ever seen the possible resource actions/operations included, though, which is what the original REST would demand.

      That’s how I understood it anyway.

      Their suggestion of using HTML rather than JSON is mainly driven by their htmx approach, which is what the project and website are about. Throughout the article, though, they leave open which data format is actually used. In your quoted text they say “for example”. In a later example, they show what JSON with hyperlinks could look like. (But then you need knowledge about that generalized meta structure.)
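
      As a rough sketch, a JSON response that also included the possible actions (the part I’ve never seen in practice) might look something like this; the shape and names are purely illustrative, not any particular standard:

          // Illustrative only: a response carrying both navigational links and
          // the operations currently allowed on the resource.
          interface Link { rel: string; href: string }
          interface Action { name: string; method: string; href: string }

          interface AccountResponse {
            balance: number;
            links: Link[];     // e.g. self, next page, related resources
            actions: Action[]; // what you may do with the resource right now
          }

          const example: AccountResponse = {
            balance: 100,
            links: [
              { rel: "self", href: "/accounts/42" },
              { rel: "next", href: "/accounts/43" },
            ],
            actions: [
              { name: "deposit", method: "POST", href: "/accounts/42/deposit" },
              { name: "withdraw", method: "POST", href: "/accounts/42/withdraw" },
            ],
          };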

    • @[email protected]
      link
      fedilink
      1
      1 month ago

      It feels like he’s trying to say something like Swagger should always be required. One of the things about SOAP, for example, was that it always had an auto-generated WSDL that you could consume to get everything. Quite a few REST endpoints were missing this when first developed.

      But I do agree that “forms” and “html” are quite the opposite of an API.

      • Fred
        link
        fedilink
        1
        1 month ago

        Well I’m not missing the point then, that’s good to know :)

  • @[email protected]
    link
    fedilink
    English
    4
    edit-2
    1 month ago

    I really struggle to see where HATEOAS can be used. Obviously not for machine-to-machine uses, as others have pointed out. But even for humans it would lead to terrible interfaces.

    If the state of the resource changes such that the allowable actions available on that resource change (for example, if the account goes into overdraft) then the HTML response would change to show the new set of actions available.

    So if I’m in overdraft, some actions are not available? Which means they are not shown at all? How can a user easily know that there are things they could do, if it wasn’t for the fact that they are in a specific state? Instead of having disabled buttons and menus, with help text explaining why they are not usable, we just hide them? That can’t be right, can it? So how do we actually deliver a usable UX using HATEOAS?

    Or is it just meant for “exploration”, and real clients would not rely on the returned links? But how is that better than actual docs telling you the same but much more clearly?

    • @[email protected]OP
      link
      fedilink
      English
      1
      1 month ago

      I found the dropping of actions quite surprising as well. I would suspect we could return the links with a disabled attribute, if they should be displayed but not accessible/triggerable.
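
      Something along these lines, maybe (a made-up shape, just to illustrate the idea):

          // Keep every action in the response, but flag the ones the current
          // state does not allow, so a UI can render them as disabled with
          // help text instead of hiding them entirely. Names are invented.
          interface AccountAction {
            name: string;
            method: string;
            href: string;
            disabled: boolean;
            reason?: string; // help text shown when the action is disabled
          }

          const overdrawnAccountActions: AccountAction[] = [
            { name: "deposit", method: "POST", href: "/accounts/42/deposit", disabled: false },
            { name: "withdraw", method: "POST", href: "/accounts/42/withdraw",
              disabled: true, reason: "Account is in overdraft" },
          ];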

  • @[email protected]
    link
    fedilink
    English
    3
    edit-2
    1 month ago

    Why is he not mentioning Restful Objects? This is exactly what he describes, just encapsulated in JSON instead of HTML, and in a way that it can actually be consumed automatically, as there are some guidelines on how to structure the documents.

    We use it at work, and I don’t like it. It’s overly complicated and adds a lot of overhead (at least in the way we implemented it). A simple HTTP+JSON RPC with a good URL structure and OpenAPI documentation would be easier to understand and to consume.

    • @[email protected]OP
      link
      fedilink
      English
      2
      edit-2
      1 month ago

      Doesn’t help that it’s a multi-page document

      Persistent domain entity, Proto-persistent domain entity, View model, …

      What the heck… Yeah, I wouldn’t want to use that either. While it may be a formalization, it seems like it would significantly increase complexity and overhead. That can’t be worth it unless it’s a huge enterprise system that has to work with generalized object types across teams or something.

      I hadn’t heard of Restful Objects before.

  • @[email protected]
    link
    fedilink
    English
    2
    30 days ago

    How does HATEOAS deal with endpoints that take arguments? E.g., I have an endpoint that merges the currently viewed resource with another one. Does it require a new (argumentless) endpoint showing a form where one can enter the second resource? Wouldn’t it be quite inefficient if you now have to do two (or more) requests instead of just one?
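
    For concreteness, what I have in mind: instead of the client simply knowing it can POST the other resource’s id to the merge endpoint, the response would first have to describe that operation as a form with fields, roughly like this (shape and names invented):

        // Hypothetical form-style description of a "merge" operation that
        // takes an argument (the other resource). The client would have to
        // fetch or receive this description first, then submit the filled-in
        // form, instead of calling the endpoint directly.
        interface Field { name: string; type: string }

        interface FormAction {
          name: string;
          method: string;
          href: string;
          fields: Field[]; // the operation's arguments
        }

        const mergeAction: FormAction = {
          name: "merge",
          method: "POST",
          href: "/things/42/merge",
          fields: [{ name: "otherId", type: "text" }],
        };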

  • The Bard in Green
    link
    fedilink
    English
    1
    1 month ago

    As a security and DevOps engineer, HTMX has been such a pain in my butt lately.

    Something are broken? -> Web devs blame WAF -> Me debugs and researches for hours when I has better stuff to do -> Finally me: WAF is fine. Is your broken JavaScript. Wut do? -> Web devs: Not know, write in HTMX, JS is abstracted, now we fix. -> 15 minutes later web devs: We fix! We do basic thing wrong! Now learn something new about HTMX. -> Me: Great. Thanks so much for that.

    • @[email protected]OP
      link
      fedilink
      English
      1
      edit-2
      1 month ago

      I didn’t quite follow.

      They’re using htmx, making errors, and learning something new about using it?

      That’s like using any new tech though, right? Or - depending on the devs - happens even with established tech.

      I’ve never seen htmx in production. I find it interesting though and want to explore using it. That won’t be at work though. :)

      • The Bard in Green
        link
        fedilink
        English
        1
        edit-2
        30 days ago

        It’s not HTMX (well kinda, see below). It’s devs being like “These bugs aren’t my fault until you prove they are.”

        And it’s abstraction. I understand WHY we do it… I guess. I don’t AGREE, and this is why. If you’re playing with code that generates code that you don’t understand and then can’t debug… I’ve always preferred to write the code that’s going to run. Don’t give me APIs that interact with the database, let me write SQL. Don’t give me something with a “simple syntax” that behind the scenes generates a bunch of JavaScript doing who knows what that no one on the team understands (HTMX); it’s gonna be easier on everybody to just write the JavaScript. The only reason the devs solved it was that ChatGPT knew what they had missed… and I had to point them in that direction (I cut and pasted the errors into ChatGPT and then said “Guys, IDK what this means because I don’t know anything about HTMX. Do YOU GUYS KNOW?”)

        I ended up spending hours of my life on some tech some hotshots wanted to use because it’s the “new, sexy thing”, and what I learned was “Oh, this is just abstracted syntax on top of Ajax, which is ALREADY abstracted syntax on top of JavaScript.” I find this annoying, not charming.