Good morning,

Hope this is not a stupid question; I am very new to Blender. My setup is:

  • 3d env built from iPad photogrammetry

  • we insert some lasers (a simple cylinder with emission node)

  • we control the lasers via QLC+ --> Art-Net --> BlenderDMX, with a Python expression that modulates each laser's emission color from a separate DMX channel.
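The per-laser expression could look something like this. This is only a sketch of the mapping logic, not the actual BlenderDMX API; `dmx_to_emission` and the base-color default are illustrative names, and in practice the 0-255 channel value would come from whatever variable the driver exposes.

```python
# Hypothetical sketch: scale a laser's base emission colour by one
# 8-bit DMX channel value (0-255). Not BlenderDMX API code.
def dmx_to_emission(value, base_color=(0.0, 1.0, 0.0)):
    """Return the base RGB colour scaled by a DMX level of 0-255."""
    level = max(0, min(255, value)) / 255.0
    return tuple(c * level for c in base_color)
```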

We would now love to store the DMX animation directly in Blender as keyframes, so we can export the animation and put it back on the iPad for AR simulation. Is there any way to record the driver data in real time?
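One possible recording approach, sketched under assumptions: hook a `frame_change_post` handler in Blender, sample the live DMX value each frame, and later write the samples out with `keyframe_insert`. The capture logic below is shown standalone so it can run outside Blender; `read_dmx_channel` is a hypothetical stand-in for whatever reads the live value from BlenderDMX.

```python
# Sketch: record per-frame driver values so they can later be baked
# into keyframes. Inside Blender this function would be registered as a
# frame_change_post handler, and each stored sample would become an
# obj.keyframe_insert(...) call on the emission colour.

recorded = {}  # frame number -> sampled emission colour

def record_frame(frame, read_dmx_channel):
    value = read_dmx_channel()      # 0-255 from the DMX universe
    level = value / 255.0
    recorded[frame] = (level, level, level)

# Simulate a short playback: one channel ramping 0 -> 255 over 5 frames.
for f, v in enumerate([0, 64, 128, 192, 255]):
    record_frame(f, lambda v=v: v)
```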

  • Pennomi
    121 days ago

    I do not believe that USDZ supports features that allow you to modify the color of the object or do any kind of driver-based logic. (It certainly did not two years ago, when I was in the AR industry). Blender can do it, but Reality Composer probably can’t.

    I can think of only three paths that might be viable.

    1. If USDZ supports UV animation (unlikely?), you could use the driver to animate the UV coordinates of the cylinders across a rainbow gradient, based on what wavelength of light you need.
    2. If you have a small number of colors, you can have redundant cylinders for each color, and show/hide them in the animation to fake the color change.
    3. In Reality Composer, set up some hooks that swap the materials of the cylinders based on various triggers. I think you can use a timer as a trigger. This is a very manual process and would be an absolute nightmare.
    • @TDSOJohn (OP)
      220 days ago

      Thanks for the detailed explanation. Not the answer I hoped for, but definitely the answer I needed! I will look into all three options. Thanks again!