Best practice deployment to robot

#1
Hi all,

What is the best practice for deploying a program I created offline to the real robot?

I don't mean the process of transferring the program to the controller, but adjusting all the frames and positions so that the offline model matches the real physical setup.

1. Do you measure the frames and TCPs with the robot controller's internal tools and transfer them to the RoboDK project?
2. Do you measure the frames and TCPs using RoboDK's measurement tools when connected online?
3. Do you use two frames per station? I don't get very accurate results after measuring the frame online, so I always add a second offset frame with offsets in the range of 1 to 2 mm, depending on the station. How do you deal with this problem?

Any other tips?

Best regards
Maxim
#2
The best practice highly depends on your project, application and controller. With RoboDK calibration tools you can define tools (TCPs) and reference frames. These are more or less the same operations you could perform with some robot controllers to define your tools and coordinate systems.

If you are concerned about accuracy, defining the tool in RoboDK allows you to enter more than the standard 4 points, so you can get a more accurate definition of your TCP. You also get a better understanding of the errors you can expect with your system.

If you define your tools and coordinate systems with RoboDK (using the Define tool/reference frame option under Utilities, using a product such as TwinTool to automatically calibrate the TCP or TwinTrack to define your coordinate system, using custom probing, or using the 3D models), the default program export should include the correct definition of your tool and reference frame.

If you define your tools and coordinate systems using the robot controller, you can simply copy their definitions (XYZABC) into RoboDK and they'll be exported the same way you have them defined in your robot controller. Also, with some controllers it is common practice to use indexed tools and frames (such as BASE_DATA[id]/TOOL_DATA[id] for KUKA, UFRAME/UTOOL for Fanuc, etc.). If you prefer using the references and tools defined in your robot controller, you can simply rename your tools and references in RoboDK to end with a number (for example, Tool 2). This number is passed to the post processor and used as the index by default with some post processors. This way you will always use the latest definition of your tool and coordinate system from your controller.
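If you want to script that copying step, a minimal sketch with the RoboDK Python API could look like the one below. The item names, indices and XYZABC values are placeholders for illustration only, and KUKA-style angles (mm and degrees) are assumed:

    # Hedged sketch: copy tool/base definitions read from a KUKA controller into
    # the open RoboDK station. Names and values below are placeholders.
    from robodk.robolink import Robolink, ITEM_TYPE_TOOL, ITEM_TYPE_FRAME
    from robodk.robomath import KUKA_2_Pose  # [X, Y, Z, A, B, C] (mm/deg) -> pose

    RDK = Robolink()

    tool_xyzabc = [0, 0, 299.4, 0, 0, 0]      # e.g. TOOL_DATA[2] from the controller
    base_xyzabc = [850, -120, 400, 90, 0, 0]  # e.g. BASE_DATA[3] from the controller

    RDK.Item('Tool 2', ITEM_TYPE_TOOL).setPoseTool(KUKA_2_Pose(tool_xyzabc))
    RDK.Item('Frame 3', ITEM_TYPE_FRAME).setPose(KUKA_2_Pose(base_xyzabc))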

You can use as many frames as you need. One frame is the minimum, unless you want to export your targets with respect to the robot base. Having 2 frames like you mentioned allows you to adjust offsets. On the other hand, it is normal to see 1-2 mm errors with robots unless you pay close attention to how you calibrate your tool and coordinate system.

It is very important to define your coordinate system after you properly calibrate your tool. If you can share more information about the process you followed to see that 1-2 mm error, we can better help you troubleshoot and improve accuracy. For example, I would recommend defining your tool using 10 points in RoboDK and seeing what error statistics you get.
#3
(09-12-2022, 08:06 AM) Albert Wrote: If you can share more information about the process you followed to see that 1-2 mm error, we can better help you troubleshoot and improve accuracy.

Dear Albert,
thank you very much for the detailed answer.
I finally started a project using RoboDK so I can show you what my troubles are.

My first problem is with machining projects. The robot is a KUKA KR10 R1100-2.
I have this turntable:


At first I measured the TCP with RoboDK with 10 points on the plane of the turntable. I added the points to this post. The errors:


It seems like the errors are too small to be real, but I measured the TCP twice and got exactly the same TCP, though the errors were bigger the first time.

After that I measured the turntable with RoboDK and the tool. I added the points as well. The errors:


When I move the robot to the zero point it seems pretty accurate:



Attached Files
tool_measured.txt (711 bytes)
turn_table_measured.txt (430 bytes)
#4
When I execute the machining project in RoboDK the robot moves accurately along the curve:


But on the real robot it looks like this; it is off by around 1.5 mm (I added an offset of 4 mm to the tool tip in Z so it does not scratch the surface of the part):


I understand that the error is probably caused by the robot tolerances. I measured the TCP and reference frame in another tool orientation. My question is how to best deal with it. Have I done something wrong, or would you simply add an offset frame? But then it would mean that I always have to adjust my programs after creating them with RoboDK whenever I use a slightly different tool orientation. For example, if I decide to rotate the turntable to 90 degrees instead of 0 degrees, I would have to adjust the position again. That is something I would like to avoid.
Is there a way around it without calibrating the robot with RoboDK?

My second problem is with pick and place, though I think I found a good solution to this. I watched Jeremy's modules on YouTube but I am still unsure how to best connect the simulation with the real robot.
For example, in Module 03-07 (organize boxes): in a real robot application where accuracy matters (with ATI tool changers, for example), every box would have a slightly different position on the real robot compared to the simulation, even with accurately calibrated TCPs and reference frames. How would you approach something like this?
My solution now is to create a reference frame for every box like Jeremy did. In that frame I have the approach and retract targets, but I also put a "zero" pick target in it. Then I move the real robot to the pick target and adjust the position manually. Once I have found the position, I overwrite the reference frame position of the box in RoboDK with the current position of the robot. As the box 3D model is inside the reference frame, the model also moves to the real position in space. So I kind of use the reference frame as a target frame.
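In case it helps, that frame-overwriting step can also be scripted. The sketch below uses the RoboDK Python API; the item names are hypothetical and it assumes the box frame is a direct child of the robot base frame in the station tree:

    # Hedged sketch: move the box reference frame to the current TCP position.
    from robodk.robolink import Robolink, ITEM_TYPE_ROBOT, ITEM_TYPE_FRAME

    RDK = Robolink()
    robot = RDK.Item('KUKA KR10 R1100-2', ITEM_TYPE_ROBOT)
    box_frame = RDK.Item('Box Frame 1', ITEM_TYPE_FRAME)  # hypothetical item name

    # TCP pose with respect to the robot base = forward kinematics * tool pose
    tcp_wrt_base = robot.SolveFK(robot.Joints()) * robot.PoseTool()

    # Place the box frame (and the box model inside it) at the taught pick position
    box_frame.setPose(tcp_wrt_base)

If the simulation is synchronized with the real robot (for example after retrieving the current joints from the controller), robot.Joints() reflects the physical arm's position, so the frame should land where the real robot was taught.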



Is there maybe a better way?

I really love Jeremy's RoboDK Pro Modules and I learned a lot from them. But I would also like to see a module that shows the process of deploying an offline-created program to a real robot. I don't mean how to transfer a program to the controller or something like that, but some best practices on what to do when, after the tool and reference frame calibration, the position of the robot is still off by 1-2 mm from the real target.

I also want to mention: RoboDK and the whole team behind it are awesome! I really appreciate all the work and time you put into creating a great product, documentation and tutorials, and answering all the questions.
#5
Let me comment regarding TCP calibration and the small errors you obtained:
If you took the 10 points by only rotating around an existing TCP and not adjusting the XYZ position to match the pointer, it is normal to see such small errors (under 0.001 mm). This happens because you are numerically using a perfect TCP and did not correct any points, so you numerically obtain the same TCP you previously had.

To properly calibrate the TCP, even if you already have a good estimate, you should adjust the XYZ position as much as possible. You should then see more realistic robot accuracy/repeatability errors. These errors should be in the order of 0.010-1.000 mm (depending on the robot and the setup).
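To illustrate why this happens, here is a minimal, hedged sketch of the least-squares math behind TCP-by-point calibration (plain NumPy, not RoboDK's implementation). Each flange pose touching the same fixed tip gives the constraint R_i*p + t_i = q, where p is the TCP in the flange frame and q is the tip in the robot base frame:

    # Hedged sketch: TCP-by-point calibration as a linear least-squares problem.
    # poses: list of 4x4 flange-to-base matrices taken while touching the same tip.
    import numpy as np

    def calibrate_tcp_by_point(poses):
        A_rows, b_rows = [], []
        for T in poses:
            R, t = T[:3, :3], T[:3, 3]
            A_rows.append(np.hstack([R, -np.eye(3)]))  # unknowns stacked as [p, q]
            b_rows.append(-t)
        x, *_ = np.linalg.lstsq(np.vstack(A_rows), np.concatenate(b_rows), rcond=None)
        p, q = x[:3], x[3:]                            # TCP (flange) and tip (base)
        errors = [np.linalg.norm(T[:3, :3] @ p + T[:3, 3] - q) for T in poses]
        return p, q, errors

Feeding this only poses that were rotated around an already-defined TCP, without any XYZ corrections, reproduces that TCP with near-zero residuals, which matches the unrealistically small errors described above.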
#6
Let me share my experience in transferring programs from CAM to a robot. These are not rules, but little tricks that can make solving problems a lot easier.

0. Transferring coordinates to CAM is much easier than moving something in the real world. I understand that this is obvious, but sometimes such a question arises. Try not to move/rotate the robot's virtual base coordinates.

1. Zeroing (mastering) the robot: you can do this using the marks on the axes, but some robots come with a document that lists the absolute encoder values recorded when the robot was zeroed at the factory calibration station. If the robot is not physically damaged and has not been rebuilt since the factory, the best option is to bind the joint zeros to these values.

In this case, the mathematical kinematic model of the robot will more closely match the real one.

2. When you teach a tool on a robot and you need accuracy in a certain area, calibrate the tool in that area, in the plane in which you want to work. This way you can set up multiple tools for multiple workspaces. When setting up a tool, the Cartesian coordinates are usually accurate and the problems arise with the tool OAT (orientation) angle values. This technique therefore helps you get good positioning accuracy, provided that you do not change the orientation of the tool much in the area in which it was taught. Also, solutions like our TwinTool allow you to fine-tune the tool.

3. If you need to correctly position something in CAD/CAM, you can measure the position of the object with the robot itself, provided it is correctly zeroed and the tool is calibrated. You just need to measure 3 points for any plane in which you want to work (see the sketch after this list).

4. Calibration of external axes is a challenge every time, especially if it is a 2- or 3-axis positioner and you need fully coordinated motion. To set up these functions using a robot controller, the accuracy of the robot calibration and the accuracy of the TCP are extremely important. The positioner rotation axis coordinates can be found in the controller and transferred to RoboDK.

5. If you have access to a laser tracker, use it. In calibration tasks, the tracker is very useful and allows you to do many things faster and more accurately.
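Regarding point 3, a minimal sketch of building a reference frame from three probed points is shown below (plain NumPy, not a RoboDK function, and the probed values are hypothetical): p1 is the origin, p2 a point along +X, and p3 a point in the XY plane on the +Y side.

    # Hedged sketch: reference frame (4x4 pose) from 3 points probed with a calibrated TCP.
    import numpy as np

    def frame_from_3_points(p1, p2, p3):
        p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
        x = p2 - p1
        x /= np.linalg.norm(x)        # X axis along p1 -> p2
        z = np.cross(x, p3 - p1)
        z /= np.linalg.norm(z)        # Z axis normal to the probed plane
        y = np.cross(z, x)            # Y axis completes the right-handed frame
        pose = np.eye(4)
        pose[:3, 0], pose[:3, 1], pose[:3, 2], pose[:3, 3] = x, y, z, p1
        return pose

    # Example with hypothetical probed points (mm):
    print(frame_from_3_points([500, 0, 10], [600, 0, 10], [500, 80, 10]))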
#7
Thanks to you both for answering!

I tried calibrating the TCP on two different planes with 30-35 points, and with the tip-to-tip method with 11 points, on a new KR10 R1100-2 robot; zeroing was checked with KUKA's EMD. The TCPs are given as [X, Y, Z, A, B, C] in mm and degrees, errors in mm:
1. Plane 1300 mm x 800 mm: [ -0.780005, -0.587233, 298.278491, 0.000000, 0.000000, 0.000000 ], Errors: mean 0.119, std 0.08, max 0.33
2. Plane 200 mm x 150 mm: [ -0.911481, -0.407068, 298.067150, 0.000000, 0.000000, 0.000000 ], Errors: mean 0.148, std 0.148, max 0.546
3. Tip-to-tip: [ -1.351774, -0.330700, 299.312497, 0.000000, 0.000000, 0.000000 ], Errors: mean 0.353, std 0.176, max 0.628

So 1 and 2 are fairly similar, but 3 is "way" off. The interesting part is that the Z value should be correct with method 3: the tool length is hard to measure but should be around 299.4 mm. So is the "Calib XYZ by plane" method less accurate? Both planes are milled, I used the whole plane with large differences in XY, and I used a gauge block, so the planar deviation should be minor. The tool orientation was the same with all 3 methods.

I checked the results of all 3 calibrations by rotating the TCP around a tip in A, B and C, and none was very good; I guess around ±1 mm.


Will TwinTool generate more reproducible/accurate results? Is it possible to get a trial license for a week?


Attached Files
tools.txt (5.3 KB)
#8
Quote: I checked the results of all 3 calibrations by rotating the TCP around a tip in A, B and C, and none was very good; I guess around ±1 mm.

Are you making rotations using a teach pendant?

Please try to write a simple program with OAT rotations around the tip (hold the deadman switch the whole time while you record the points to avoid 'backlash' when you release the trigger to record them). Then you can check the real deviations from the tip by running this program.
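As an example, a rough sketch of such a check with the RoboDK Python API is shown below. The robot item name and the tip pose are placeholders, the tool is assumed to point down at the tip, and the rotated orientations are assumed to be reachable:

    # Hedged sketch: revisit the same tip position with different tool orientations
    # to check how much the physical tip drifts.
    from math import pi, radians
    from robodk.robolink import Robolink, ITEM_TYPE_ROBOT
    from robodk.robomath import transl, rotx, roty

    RDK = Robolink()
    robot = RDK.Item('KUKA KR10 R1100-2', ITEM_TYPE_ROBOT)

    tip_pose = transl(400, 0, 50) * rotx(pi)  # tip pose w.r.t. the active reference frame

    robot.MoveL(tip_pose)
    for deg in (20, -20):
        robot.MoveL(tip_pose * rotx(radians(deg)))  # rotate about the tool X axis, tip position unchanged
        robot.MoveL(tip_pose * roty(radians(deg)))  # rotate about the tool Y axis, tip position unchanged
    robot.MoveL(tip_pose)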

In my practice, I'm mostly using tip-to-tip methods.



Quote: Will TwinTool generate more reproducible/accurate results? Is it possible to get a trial license for a week?


Do you have a linear gauge sensor?

You can find more info at the link below:
https://robodk.com/doc/en/Robot-Automatic-Calibration-TwinTool.html
#9
(02-01-2023, 08:55 AM) Sergei Wrote: Are you making rotations using a teach pendant?

Yes, I did. But I also checked the Cartesian position and it did not change.


Quote: Do you have a linear gauge sensor?

Not yet, but I could borrow a Keyence linear gauge to test TwinTool.
#10
It would be best if you contact us by email at info@robodk.com so we can help you test TwinTool.

This is possible using a Keyence sensor with a flat contact point (25 mm recommended), connected to a computer via USB, socket or Ethernet/IP.



