• MrScottyTay
    1 year ago

    I didn’t open the book, someone else looked into the book and wrote it down for me to then read when needed, just like how someone would put in the data for a program to look it up when asked.

        • @[email protected]
          1 year ago

          No, you are hung up on trying to read the book without actually reading it.

          That breaks the puzzle, since the device would not be able to analyze the inside of an item of food from a picture, and can only use highly generic data based on what it can assume from an image of the outside.

          • MrScottyTay
            1 year ago

            Re-read the first one I sent.

            You can get a pretty good generalisation if you know what the food is. How do you think current apps for tracking nutrition work? All this will do is try to figure out what the food is from the picture rather than having the user type it in. For most foods you can tell what they are without “looking inside”. I’m pretty sure there are apps that do that now; this isn’t something new and groundbreaking.

            And for nutrition you don’t need to be 100% exact when tracking it, because you can’t be 100% exact even if you do know the ingredients and how much of each one. Everything always has a variance. This method doesn’t need to be perfect to meet the needs of most who will use it.
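            The approach described above (identify the food from the photo, then fall back on generic per-food nutrition data) can be sketched roughly like this. Everything here is hypothetical: the classifier is a stand-in for a real image model, and the nutrition table holds made-up illustrative round numbers, not authoritative values.

            ```python
            # Hypothetical sketch of a photo-based nutrition tracker:
            # 1) classify the food from the image, 2) look up generic data.
            # The classifier and the table below are stand-ins, not a real API.

            GENERIC_NUTRITION = {  # kcal per 100 g, illustrative values only
                "apple": 52,
                "banana": 89,
                "pizza": 266,
            }

            def classify_food(image_path: str) -> str:
                """Stand-in for an image classifier; a real app would run a
                vision model here instead of returning a fixed label."""
                return "apple"

            def estimate_calories(image_path: str, grams: float) -> float:
                """Generic estimate: label the food, scale its per-100g value.
                Inherently approximate, which matches how tracking is used."""
                label = classify_food(image_path)
                kcal_per_100g = GENERIC_NUTRITION[label]
                return kcal_per_100g * grams / 100

            print(estimate_calories("lunch.jpg", 150))  # rough estimate, not exact
            ```

            The point of the sketch is that the photo only supplies the label; the nutrition numbers come from a lookup table, so the result is only ever as good as the generic data behind it.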

            • @[email protected]
              1 year ago

              I agree that you can get a generic nutrition value from a photo of a simple fruit or vegetable, but a pie or cake contains so much stuff that looks identical to other stuff that photographic analysis is useless there.

              So yes, you can get some idea of the nutrition of some foods, but the accuracy is way too low to be useful.