10-29-2021, 03:42 AM
Hi,
I'm looking at performing live path corrections to improve the accuracy of a robot in real time.
We would like the robot to perform machining, using a laser tracker or similar vision-based TCP state estimation system in an attempt to compensate for deflection, etc.
Ultimately, the goal is to also have the system recognize deflection and adjust in an attempt to combat chatter.
From what I've gathered,
We can use the delta values between where the robot thinks it is and its observed position to adjust the reference frame of move commands before they are called. However, once a path is in motion, it won't update if the reference frame moves.
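Roughly what I have in mind, sketched in Python (the function and variable names are just illustrative, not any real robot API, and rotation error is ignored for simplicity):

```python
# Hypothetical sketch of the delta-correction idea: measure the error between
# the commanded TCP position and the position the laser tracker reports, then
# pre-compensate the next move target by that error before it is issued.
# XYZ only; orientation correction is left out to keep the idea clear.

def corrected_target(commanded_tcp, measured_tcp, next_target):
    """Shift the next target opposite to the observed position error."""
    # Observed error: where the tracker sees the TCP minus where the
    # controller thinks the TCP is.
    delta = [m - c for c, m in zip(commanded_tcp, measured_tcp)]
    # Subtract the error so the robot "aims past" its drift and lands
    # closer to the intended point.
    return [t - d for t, d in zip(next_target, delta)]

# Example: controller reports (100, 0, 50) but the tracker sees
# (100.4, 0.1, 49.8), so the next target gets nudged the other way.
print(corrected_target([100.0, 0.0, 50.0],
                       [100.4, 0.1, 49.8],
                       [120.0, 0.0, 50.0]))
```

This is the "adjust before the move is called" case only; doing it mid-path would need whatever realtime correction interface the controller exposes.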
I remember seeing an example where a robot was machining a long, asymmetric, tiered, circular object rotating on a secondary machine (sort of like automated turning on a lathe).
This gives me hope that such can be done, hopefully without too much difficulty.
Any advice?