Replies: 4 comments
-
Hi, if you put the ultrasound transducer in a flat-bottom water container, then you will see the bottom as a line in the ultrasound image. If you move the transducer up and down, that line will move down and up in your image. If you visualize the transducer motion based on tracking, that will also go up and down. You can use the direction-change timepoints in the two modalities to measure the time offset. This is implemented in the fCal application of PLUS, but it's also easy to measure in Slicer if you stream the data to Slicer via OpenIGTLink.
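A minimal sketch of that comparison in Python, assuming you have already extracted the two motion signals (bottom-line position from the images and probe position from tracking) as timestamped 1-D arrays; the function and variable names here are illustrative, not part of PLUS:

```python
import numpy as np

def direction_change_times(t, x):
    """Timepoints where a motion signal reverses direction (velocity sign flip).
    Smooth x beforehand so sensor noise does not create spurious reversals."""
    v = np.diff(x)
    flip_idx = np.where(np.sign(v[1:]) != np.sign(v[:-1]))[0] + 1
    return t[flip_idx]

def estimate_time_offset(t_image, line_pos, t_tracker, probe_pos):
    """Mean difference between matched direction-change times of the two streams.
    Assumes both recordings cover the same up-down cycles, in the same order."""
    ti = direction_change_times(np.asarray(t_image), np.asarray(line_pos))
    tt = direction_change_times(np.asarray(t_tracker), np.asarray(probe_pos))
    n = min(len(ti), len(tt))
    return float(np.mean(ti[:n] - tt[:n]))
```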
-
@ungi answered the question well. I would just add that you can expect more consistent and more accurate temporal synchronization if you run a single PlusServer instance. You can find an example configuration file for joint tracking and imaging data acquisition with automatic temporal calibration (and more information about the calibration process) in the temporal calibration algorithm documentation.
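For example, once both devices are described in one configuration file, a single server can be started with it; the configuration file name below is a placeholder, not a file shipped with PLUS:

```python
# Launch one PlusServer instance so both devices share a single clock and
# data-collection pipeline. The configuration file name is hypothetical;
# point it at your own combined Telemed + Polaris device set file.
import subprocess

subprocess.run([
    "PlusServer",
    "--config-file=PlusDeviceSet_Server_Telemed_NdiPolaris.xml",
])
```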
-
Thank you both, I'll give both your suggestions a go for synchronization, and a single server for both devices. I guess my next question is: how do I actually record the US video and the tracking data? I see that the IGT module offers 'Plus Remote', but I'm failing to see how I can record the data in a usable format for us (a video file and coordinate data). I suppose I'm a little worried that I'm going down the rabbit hole with Slicer, as this is very new to me... In reality, I just need to give a local timestamp to our US video and our motion capture coordinate data, which we can then use with the above synchronization suggestion to figure out the offset. I thought this would likely be doable given that Plus is streaming both devices' data. Sorry for the confusion, and thank you again for your time!
-
For temporal synchronization, I think it's better to record the data in PLUS, not Slicer, because PLUS is closer to the hardware in the data streaming pipeline, so there is less chance of delays and data loss. The fCal app can record data to sequence metafiles, and you can use the Plus Remote module in the SlicerIGT extension (in Slicer) to trigger a capture device in a PLUS server. Sequence metafiles can then be loaded in Slicer for analysis, unless you use fCal to compute the time offset for you. Truth be told, I usually do the temporal synchronization the following way:
But some people may not agree, because this is a very manual, step-by-step process. It's probably better to use fCal, where this is more automated. I just like to do things the hard way so I learn from my mistakes :)
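As a side note, sequence metafiles are plain MetaImage (.mha) files whose text header stores per-frame timestamps and transforms, so they are easy to inspect with a short script. A sketch, assuming an uncompressed file and a transform named ProbeToTrackerTransform (the file name and transform name depend on your configuration):

```python
import re

# Collect per-frame timestamps and probe transforms from the text header of a
# PLUS sequence metafile. File name and transform name are examples only.
timestamps, transforms = {}, {}
with open("TrackedUltrasound.mha", "rb") as f:
    for raw in f:
        line = raw.decode("ascii", errors="ignore").strip()
        if line.startswith("ElementDataFile"):
            break  # the binary pixel data follows this line; header is done
        m = re.match(r"Seq_Frame(\d+)_Timestamp = (.+)", line)
        if m:
            timestamps[int(m.group(1))] = float(m.group(2))
        m = re.match(r"Seq_Frame(\d+)_ProbeToTrackerTransform = (.+)", line)
        if m:
            transforms[int(m.group(1))] = [float(v) for v in m.group(2).split()]
```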
-
Hi all,
I'm a new Plus user looking to temporally synchronize our Telemed MicrUs video output and NDI Polaris Vega ST output (we are measuring in vivo tendon biomechanics). Since the Vega lacks external triggering, it was suggested that we use Plus to stream our data, which would allow us to get timestamps for each stream of data. We've successfully made servers for both devices and connected them to Slicer; however, we are failing to find the correct functions to make this all work. Does anyone have any feedback or suggestions on how this could be done? A sketch of the kind of timestamped access we're after is below.
Thanks a bunch!
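For reference, one way to read timestamped messages from a PlusServer stream in Python is the pyigtl package; the device names below are placeholders that must match the names in the PLUS configuration, and the assumption is that pyigtl messages expose the OpenIGTLink header timestamp as a .timestamp attribute:

```python
import pyigtl  # pip install pyigtl

# Connect to the PlusServer's OpenIGTLink port (18944 is the PLUS default).
client = pyigtl.OpenIGTLinkClient(host="127.0.0.1", port=18944)

# Device names are placeholders; they must match the names in the PLUS config.
image_message = client.wait_for_message("Image_Reference", timeout=5)
transform_message = client.wait_for_message("ProbeToTracker", timeout=5)

# Each message carries the acquisition timestamp from its OpenIGTLink header;
# logging these for both streams gives a common clock to compare offsets.
print(image_message.timestamp, transform_message.timestamp)
```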