OpenIGTLinkTracker interpolation error #967
Is the OpenIGTLinkTracker sender on the same computer? If it is on a different computer then you either need to synchronize the clocks or use the received timestamps (instead of the timestamps embedded in the OpenIGTLink messages).
The 3D Slicer application is running on the same computer as PlusServer.exe.
Sending transforms from Slicer (application layer) to Plus (low-level data collection layer) is highly unusual. Data is expected to travel from the low-level hardware/data acquisition layers to the application/processing layer. Can you tell us about your overall goal and how you plan to achieve it?
We have users checking checkboxes in Slicer to indicate the orientation of their transducer (#964). Users can then click a capture button, and we need the image saved to disk in the correct orientation.
MF/MN/UF/UN never changes for a transducer (it indicates the physical location of the marker on the transducer), so it should not change. If you use a framegrabber, the user may change the image flip when setting up the system or switching transducers, but that should be a very rare event; in that case you can change the device set configuration in Plus and restart the server (you can do this from Plus, using the Plus Remote module).

If you only need to modify a transform in the transform repository in Plus, then you don't even have to restart the server; the transform can be updated at runtime through a remote command.

If you only need to modify how the image is displayed in the application (show the marked side of the transducer left/right, show the transducer surface at the top/bottom), then you must not change the device set configuration. That is just a display setting, which you can change either by adjusting the Volume Reslice Driver module settings or by modifying the driver transform (e.g., add a transform below the ImageToReference transform and use that as the driver transform).
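As an illustration of the last suggestion (a display-only transform added below ImageToReference), here is a minimal sketch for the Slicer Python Interactor. The node names `ImageToReference` and `Image_Reference` are assumptions and must be replaced with the actual nodes in the scene; the Volume Reslice Driver settings themselves are left to that module's GUI.

```python
import vtk
import slicer

# Assumed node names; replace with the actual nodes in your scene.
imageToReference = slicer.mrmlScene.GetFirstNodeByName("ImageToReference")
imageNode = slicer.mrmlScene.GetFirstNodeByName("Image_Reference")

# Create a display-only transform and place it below ImageToReference,
# then move the image node under it.
displayFlip = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLinearTransformNode", "DisplayFlip")
displayFlip.SetAndObserveTransformNodeID(imageToReference.GetID())
imageNode.SetAndObserveTransformNodeID(displayFlip.GetID())

# Example: rotate the displayed image 180 degrees in-plane; adjust to the flip you need.
flip = vtk.vtkTransform()
flip.RotateZ(180)
displayFlip.SetMatrixTransformToParent(flip.GetMatrix())
```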
What we have is a device that holds the transducer and allows it to be rotated 180 degrees about the elevational axis or 180 degrees about the axial axis, but there is no physical tracker (no encoders or camera), so we just have checkboxes in Slicer that say "Flip U/D" and "Flip L/R". We already flip the US video being streamed into Slicer using a vtkMRMLLinearTransformNode, but we need the flips to be saved to disk as well when recording an image or video clip.
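For readers following the thread, a rough sketch of how such checkboxes could be mapped onto the vtkMRMLLinearTransformNode mentioned above. The function and the node name `FlipTransform` are illustrative, and the sign convention depends on how the image axes are defined in your scene.

```python
import vtk
import slicer

def updateFlipTransform(flipUD, flipLR, flipTransformNode):
    """Write a mirror matrix into the given linear transform node.
    flipUD mirrors the image vertically, flipLR horizontally
    (the sign convention depends on the image axis directions)."""
    matrix = vtk.vtkMatrix4x4()  # starts out as identity
    if flipLR:
        matrix.SetElement(0, 0, -1.0)
    if flipUD:
        matrix.SetElement(1, 1, -1.0)
    flipTransformNode.SetMatrixTransformToParent(matrix)

# Example usage with an illustrative node name:
flipTransformNode = slicer.mrmlScene.GetFirstNodeByName("FlipTransform")
updateFlipTransform(flipUD=True, flipLR=False, flipTransformNode=flipTransformNode)
```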
In this case I would recommend attaching an orientation sensor (such as the PhidgetSpatial 3/3/3, which costs about $100 and is supported by Plus) to the probe and getting the accurate angle automatically in real time. You can use it to reconstruct good-quality volumes by tilting or spinning the probe.
I started a PlusServer with the config here (http://perk-software.cs.queensu.ca/plus/doc/nightly/user/DeviceMicrosoftMediaFoundation.html) and updated the ImageToReference transform with OpenIGTLinkRemote, and the location of the video stream volume changed in Slicer coordinates as expected. However, after recording to an mha with Plus Remote, no transform data existed in the file.
The static transforms in the transform repository are saved in the device set configuration file; you can save the configuration file after each transform change. Again, for your use case an orientation sensor would be an appropriate solution. Plus has been used successfully for volume reconstruction with such a sensor in several projects, by various groups, with both freehand rotation and with devices that constrain probe rotation to one or two rotation axes. Considering the cost of an ultrasound imaging system (at least a few thousand dollars) and the development cost, the cost of the orientation sensor is negligible. The device is small and it just needs a USB connection.
To reproduce: create an OpenIGTLinkTracker device with UseReceivedTimestamps="FALSE" and UseLastTransformsOnReceiveTimeout="TRUE". Also create a video device (e.g. MmfVideo) and input the tracker and video into a VirtualMixer. Have the VirtualMixer output to a PlusOpenIGTLinkServer. This error will occur when changing the transform:
|ERROR|054.097000| TrackerDevice-USToReference: vtkPlusBuffer: Cannot perform interpolation, time difference compared to itemB is too big 0.868929 ( itemBtime: 53.194000, requested time: 54.062929).| in E:\D\PSNP64b\PlusLib\src\PlusDataCollection\vtkPlusBuffer.cxx(1183)
Scripts to do this in 3D Slicer:
Run this in the Python Interactor:
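(The script itself is attached to the issue rather than reproduced inline. As a rough sketch only, the setup described above could look like the following, assuming Slicer acts as the OpenIGTLink server that the Plus OpenIGTLinkTracker device connects to; the port 18945 is an assumption and must match the tracker device's settings in the attached configuration.)

```python
import vtk
import slicer

# Slicer acts as an OpenIGTLink server that the Plus OpenIGTLinkTracker
# device connects to. The port (18945) is an assumption; it must match
# the server port configured for the tracker device in the Plus config.
connectorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode", "PlusTrackerOut")
connectorNode.SetTypeServer(18945)
connectorNode.Start()

# Transform node whose name matches the transform expected by the tracker
# device (USToReference, as seen in the error message above).
usToReference = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLinearTransformNode", "USToReference")
connectorNode.RegisterOutgoingMRMLNode(usToReference)

# Changing the transform (e.g. from a checkbox callback) sends a new
# TRANSFORM message; the interpolation error is reported on the Plus side
# because transforms only arrive sporadically, when they are changed.
matrix = vtk.vtkMatrix4x4()
matrix.SetElement(1, 1, -1.0)  # example flip
usToReference.SetMatrixTransformToParent(matrix)
connectorNode.PushNode(usToReference)
```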
Then run PlusServer with this DeviceConfig:
OpenIGTLinkConfigSimple.zip