02-09-2022, 02:04 PM (This post was last modified: 02-15-2022, 12:04 PM by Jeremy.)
Hi,

Simulating a milling program with a Fanuc robot on a rail works fine. But when I generate the program files and open an LS file, it contains positions that are impossible for the robot to reach. When I try to run the file on the robot controller I get the error "Position is unreachable". Joint positions are no problem, but the Cartesian positions are unreachable for the robot (e.g. X = -3979 mm in UF_9). It looks as if the positions are calculated with the wrong frame as a reference. But then again, why does the simulation in RoboDK run like a charm?

UF_9 = -2315, 1697, -217, 90, 0, 0
TF_9 = 240, 0, 50, 0, 90, 0
See the following part of the .LS file:

/POS
P[1]{
   GP1:
    UF : 9, UT : 9,
    J1 = -77.081 deg, J2 = -27.093 deg, J3 = 14.298 deg,
    J4 = -160.426 deg, J5 = -63.759 deg, J6 = 4.452 deg,
    E1 = -6000.000 mm
};
P[2]{
   GP1:
    UF : 9, UT : 9, CONFIG : 'N U T, 0, 0, 0',
    X = -3979.380 mm, Y = 2539.940 mm, Z = 296.944 mm,
    W = 176.899 deg, P = -8.453 deg, R = -69.771 deg,
    E1 = -6000.000 mm
};
P[3]{
   GP1:
    UF : 9, UT : 9, CONFIG : 'N U T, 0, 0, 0',
    X = -3961.070 mm, Y = 2558.020 mm, Z = 294.081 mm,
    W = 176.899 deg, P = -8.453 deg, R = -69.771 deg,
    E1 = -6000.000 mm
};
Many reasons could explain this. For example, the origin of the setup in the controller might not match the origin of the setup in RoboDK (the origin of the rail). Another possibility is the Cartesian direction of the rail: by default in RoboDK it runs along X, but some axes are set up along Y.
If you bring the real robot to a set of joint values and move the robot in RoboDK to the same joint values, does the Cartesian position match? Same question if you move the rail: does it change the Cartesian value along the X axis on the teach pendant?
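To make that check concrete with the numbers from your post, here is a small sanity check in plain Python showing where a UF_9 Cartesian target lands in rail/world coordinates. It assumes the usual Fanuc fixed-axis WPR convention (W, P, R rotate about X, Y, Z, so the rotation matrix is Rz(R)·Ry(P)·Rx(W)); if your setup differs, the numbers change accordingly.

```python
import math

def xyzwpr_to_matrix(x, y, z, w, p, r):
    """Build a 4x4 homogeneous transform from Fanuc-style XYZWPR (degrees)."""
    w, p, r = map(math.radians, (w, p, r))
    cw, sw = math.cos(w), math.sin(w)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    # Rotation = Rz(r) @ Ry(p) @ Rx(w), written out row by row
    return [
        [cr * cp, cr * sp * sw - sr * cw, cr * sp * cw + sr * sw, x],
        [sr * cp, sr * sp * sw + cr * cw, sr * sp * cw - cr * sw, y],
        [-sp,     cp * sw,                cp * cw,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]

def transform_point(T, p):
    """Apply a homogeneous transform to a 3D point."""
    return [sum(T[i][j] * (p + [1.0])[j] for j in range(4)) for i in range(3)]

# UF_9 from the first post, and the XYZ of P[2] from the .LS file
T_uf9 = xyzwpr_to_matrix(-2315, 1697, -217, 90, 0, 0)
p2_world = transform_point(T_uf9, [-3979.380, 2539.940, 296.944])
print([round(v, 3) for v in p2_world])  # ≈ [-6294.38, 1400.056, 2322.94]
```

So a target like P[2], perfectly reachable relative to UF_9, sits more than 6 metres from the world origin: if the controller resolves the position against a different frame than RoboDK does, "Position is unreachable" is exactly what you get.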
Jeremy
Find useful information about RoboDK and its features by visiting our documentation and by watching tutorials on our Youtube Channel.
02-10-2022, 11:51 PM (This post was last modified: 02-11-2022, 12:52 AM by EBri.)
It's 1:45 in the morning here, so I don't have a real robot to test with right now, but I made something in RoboGuide to check your approach. While doing so, I noticed that the milling curves move with the robot when E1 traverses. Looking at the project tree in RoboDK, I don't see why this happens, because all the 'curves' are in Frame_L4, whose base is not the Fanuc frame but the stationary 'Frame_Opspanframe'. Please find a few images attached.
This means things are not attached to the proper frame in the controller. Do you know which frame is the static frame (the rail frame)? And what number is it, in fact?
First thing: looking at your RDK station, all your targets/paths should be part of the rail frame or of a frame that is a child of the rail. To fix that, simply right-click the first frame -> "Change support" -> "Frame rail".
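"Change support" keeps the item where it is in the station and only recomputes its local pose relative to the new parent. A minimal sketch of that bookkeeping in plain Python (4x4 homogeneous transforms, translations only for brevity; the frame poses are made-up numbers, not taken from your station):

```python
def T(x, y, z):
    """Homogeneous transform with identity rotation (translation only)."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv_rigid(M):
    """Invert a rigid transform: transpose the rotation, rotate-and-negate the translation."""
    R = [[M[j][i] for j in range(3)] for i in range(3)]  # R^T
    t = [-sum(R[i][j] * M[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

# World poses of two candidate parents (illustrative values only)
rail_frame_abs = T(-2315, 1697, 0)   # stationary rail frame
robot_base_abs = T(-1000, 500, 0)    # moves with the robot -> wrong parent

# A curve frame currently under the robot base, at some local pose
curve_local = T(100, 0, 50)
curve_abs = matmul(robot_base_abs, curve_local)

# Re-parent to the rail frame while preserving the absolute pose
curve_local_new = matmul(inv_rigid(rail_frame_abs), curve_abs)
print([row[3] for row in curve_local_new][:3])  # → [1415, -1197, 50]

# The absolute pose is unchanged, only the local pose moved
assert matmul(rail_frame_abs, curve_local_new) == curve_abs
```

With the RoboDK API, the programmatic equivalent is, if I remember correctly, `Item.setParentStatic(new_parent)`, which also re-parents while maintaining the absolute position in the station.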
Also, if I remember correctly, there's a hidden feature with Fanuc: if you add an ID to the name of the frame (e.g. "Frame Part 4", where "4" is the ID), it won't generate the path in UFRAME 9; it will simply refer to the UFRAME 4 set up in the controller.
Jeremy
02-13-2022, 10:52 AM (This post was last modified: 02-13-2022, 12:45 PM by EBri.)
As far as I know, the RoboDK post-processor generates only two frames for the controller/RoboGuide: ToolFrame_9 and UserFrame_9 (correct me if I'm wrong). I moved Frame_L4 back to the frame of the rail (Frame_Mechanism_Base); unfortunately, the milling curves still moved with the robot.
Good to know about that 'hidden feature', Thanks.
In the properties window of UFrame_9, the frame is 'attached' to the robot, and in this window it was not possible to turn that option off. I found out (via a forum) that this can be solved by modifying the extended axis. When an extended axis in group 1 is set as an 'auxiliary' axis, the User Frame is attached to the robot and cannot be changed. When the extended axis is set as an 'integrated' axis, it is possible to attach the User Frame to other items, e.g. the rail base. Changing this setting can be done after a Controlled Start of the controller: [Menu] -> Maintenance, etc.
Now that the extended axis is set as an integrated axis and the user frame is attached to the rail base, the User Frame is stationary at the correct location. But it ain't over until it's over: when the rail is at position E1 = 0.0 mm, the green sphere at the TCP moves correctly with every move of the robot. But when the robot is jogged along the rail (E1) in the positive direction, the green sphere moves the exact same distance in the negative direction. The tool frame remains at the flange of the robot. I'm looking for a solution to this problem; suggestions are very welcome.
Great debugging. I know it never goes fast enough, but step by step is the only way. And I really appreciate that you took the time to properly explain the steps you had to take to attach the frame to the rail frame instead of the robot frame.
Can you tell me more regarding the other problem you solved?
Jeremy
02-15-2022, 09:26 AM (This post was last modified: 02-15-2022, 10:59 AM by EBri.)
Hi,

At my customer's site I changed the robot configuration, as I did in RoboGuide, and now the real robot has its extended axis set as an 'integrated rail axis' (not as an auxiliary rail axis). The robot moves to the correct X position and from there it can follow the curves. One more (last?) step to take: in the files generated by RoboDK, some positions are expressed in joint coordinates and most in Cartesian coordinates. In the following I look at two positions which are close to each other (in the real world):
- P[1], expressed in joint coordinates (J move),
- P[2], expressed in Cartesian coordinates (L move).
With the modified controller settings, both positions are now reachable for the robot, but positions expressed in joint coordinates end up on the other side of the rail from positions expressed in Cartesian coordinates. See the attached image "P1andP2.jpg"; there I converted P[2] into joint coordinates so it can be compared with P[1] more easily. J1 makes a rotation of approximately 180 degrees to move from P[1] to P[2]. When both positions are converted to Cartesian coordinates, the problem seems to be in the Z coordinate: P[1]: Z = 2951 mm (not the correct position); P[2]: Z = 302 mm (the correct distance w.r.t. UserFrame_9). See image "cartesianP1andP2.jpg". Tests with the real robot show similar results, only there the problem is inverted: the joint positions are at the right side of the rail and the Cartesian positions are not.
In RoboGuide, UF_9 seems to be shifted in the Z direction over a distance of (2951 - 302 =) 2649 mm. On the real robot, jogging in UF_9 results in inverted directions for X and Z, so the frame seems to be rotated 180 degrees around the Y axis (of UF_9) and is probably also shifted.
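That inference is easy to verify numerically: rotating a frame 180 degrees about its Y axis inverts jogging directions along X and Z while leaving Y alone. A small Python check, applying the rotation to the unit axes of UF_9:

```python
import math

def rot_y(deg):
    """3x3 rotation matrix about the Y axis."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def apply(R, v):
    """Apply a 3x3 rotation to a 3D vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

R = rot_y(180)
print([round(x) for x in apply(R, [1, 0, 0])])  # X axis -> [-1, 0, 0]
print([round(x) for x in apply(R, [0, 1, 0])])  # Y axis -> [0, 1, 0]
print([round(x) for x in apply(R, [0, 0, 1])])  # Z axis -> [0, 0, -1]
```

So inverted X and Z with an unchanged Y is consistent with a pure 180-degree rotation about Y; any additional Z offset (like the 2649 mm seen in RoboGuide) would come on top of that as a translation.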
I'm looking for the cause(s) of the differences between the simulation in RoboDK, RoboGuide and the behaviour of the real robot. Probably involved are:
- The coordinate system of the 3D model of the robot (base plate) in RoboGuide is rotated 180 degrees w.r.t. the 3D model in RoboDK. In both systems I chose the X directions of the robot frames in the same direction (but the connectors of the base plates are at opposite sides).
- The real robot is mounted rotated 90 degrees w.r.t. the 3D model in RoboDK, but its calibration is (historically) such that the arm direction at J1 = 0.0 is the same as in RoboDK at J1 = 0.0.
If you have any suggestions, I'm glad to hear from you!
02-15-2022, 12:13 PM (This post was last modified: 02-15-2022, 12:14 PM by Jeremy.)
This looks like an issue with the way the robot was installed on the linear rail in RoboDK.
Quote:
Tests with the real robot show similar results only there the problem is inverted: the joint positions are at the right side of the rail and the Cartesian positions are not.
So even between the real robot controller and RoboGuide there's a setting discrepancy.
Can you provide your .rdk station? I will try to solve the issues related to the frame position/orientation. You can always send it via email if you can't post a stripped-down version here.
As for this part, I think I understand, but I'm still confused.
Quote:
The real robot is mounted 90 degrees rotated w.r.t. the 3D model in RoboDK but its calibration is (historically) such that the arm direction at J1=0.0 is the same as in RoboDK with J1=0.0.
Edit: Could it be that the joint direction is inverted, instead of the robot being rotated?
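A tiny numeric sketch of that hypothesis (plain Python, made-up numbers): if the controller counts E1 positive one way and the model counts it positive the other way, the predicted TCP lands the same distance in the opposite direction, which is exactly the green-sphere behaviour described above.

```python
def tcp_along_rail(robot_offset, e1, axis_sense=+1):
    """Position of the TCP along the rail axis, in the stationary frame.

    robot_offset: where the TCP sits relative to the robot base (mm)
    e1:           commanded rail position (mm)
    axis_sense:   +1 if the model's rail direction matches the controller,
                  -1 if it is inverted
    """
    return axis_sense * e1 + robot_offset

# Matching senses: jogging E1 by +500 mm moves the TCP +500 mm
assert tcp_along_rail(0.0, 500.0, axis_sense=+1) == 500.0

# Inverted sense: the TCP appears to move the same distance backwards
assert tcp_along_rail(0.0, 500.0, axis_sense=-1) == -500.0
```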
Jeremy