tmp #1

Open
HiroIshida opened this issue Jan 27, 2020 · 14 comments

HiroIshida commented Jan 27, 2020

introduction to robot model

Before starting this tutorial, please run roscore in another terminal.

get information of joint angles of a robot

(load "package://fetcheus/fetch-interface.l") 
(fetch) 

The fetch function creates an instance named *fetch*, which is a geometrical model of the Fetch robot. You can now view the model by calling the following function:

(objects *fetch*)

The robot model *fetch* contains the information of its joints. The fetch robot has 10 joints, so let's look at the state of these joints.

(send *fetch* :angle-vector)
;; output: #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0)

As you can see, the states of the 10 joints are shown as a float-vector. You probably need to know which value corresponds to which joint. To this end, the following method is useful.

(send *fetch* :joint-list :name)
;; output: ("torso_lift_joint" "shoulder_pan_joint" "shoulder_lift_joint" "upperarm_roll_joint" "elbow_flex_joint" "forearm_roll_joint" "wrist_flex_joint" "wrist_roll_joint" "head_pan_joint" "head_tilt_joint")

You get a list of 10 joint-name strings. The order of this list corresponds to the order of the float-vector you got from the :angle-vector method. By comparing the two, you know, for example, that torso_lift_joint has an angle of 20.0.
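To see the pairing at a glance, you can also zip the joint names together with the current angles. This is a minimal sketch (assuming the *fetch* model created above); coerce turns the float-vector into a list so that mapcar can walk over both:

(mapcar #'(lambda (j a) (cons (send j :name) a))
        (send *fetch* :joint-list)
        (coerce (send *fetch* :angle-vector) cons))
;; output: (("torso_lift_joint" . 20.0) ("shoulder_pan_joint" . 75.6304) ...)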

Now, let's set a custom angle vector to the robot model.

(setq *av-zero* (float-vector 0 0 0 0 0 0 0 0 0 0))
(send *fetch* :angle-vector *av-zero*)

Please click the IRT viewer previously opened by the objects function; then you will see that the robot model is updated.
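If clicking does not refresh the view, you can also redraw it explicitly; a minimal sketch, assuming the *irtviewer* global that the objects function creates:

(send *irtviewer* :draw-objects)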


You may want to set a specific joint instead of setting all the joint angles at once. For shoulder_pan_joint, for example, this can be done by:

(let ((shoulder-pan-joint (send *fetch* :shoulder_pan_joint)))
    (send shoulder-pan-joint :joint-angle 60))

Note that the same thing can be done by (send *fetch* :shoulder_pan_joint :joint-angle 60), which is more common in JSK.

By comparing this pose with the previous one in the viewer, you will observe that only the state of the single joint has changed.
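You can also read a single joint angle back by calling :joint-angle with no argument; a quick check, assuming the setting above:

(send *fetch* :shoulder_pan_joint :joint-angle)
;; output: 60.0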

solving inverse kinematics (IK)

Usually, in robotics, you want to guide the robot arm's end effector to a commanded pose (position and orientation). Thus, before sending an angle vector, you must find an angle vector with which the end effector reaches the commanded pose. This can be done by solving inverse kinematics (IK) (if you are not familiar with it, please look it up). First, we create a coordinate (or a pose) *co* by

(setq *co* (make-coords :pos #f(800 300 800) :rpy #f(0.0 0.1 0.1))) ;; #f(..) is a float-vector

Then the following code will solve the IK:

(send *fetch* :angle-vector #f(0 0 0 0 0 0 0 0 0 0))
(send *fetch* :rarm :inverse-kinematics *co*
        :rotation-axis t :check-collision t :use-torso nil)

The above call first solves IK internally to obtain the angle-vector such that the coordinate of the end effector equals *co*, and then sets the obtained angle-vector to *fetch*. (I personally find this function a bit strange and confusing; for me it would be more straightforward if it just returned the angle-vector without setting it to *fetch*.) Note that you must take care of the initial solution for the IK. In EusLisp, the angle-vector currently set to the robot is used as the initial solution; for example, in the above code I set it to #f(0 0 0 0 0 0 0 0 0 0).

When solving IK you can set several keyword arguments; :rotation-axis, :check-collision and :use-torso are particularly important. If :rotation-axis is nil, the IK is solved ignoring the orientation (rpy). If :check-collision is nil, collisions between the links of the robot are not considered. Please play with changing these arguments.
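For example, a position-only IK (ignoring the orientation of *co*) can be tried as follows; this is just the same call as above with :rotation-axis switched to nil:

(send *fetch* :angle-vector #f(0 0 0 0 0 0 0 0 0 0))
(send *fetch* :rarm :inverse-kinematics *co*
        :rotation-axis nil :check-collision t :use-torso nil)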

Since iterative optimization takes place when solving IK, the solution will change if you change the initial solution. Let's try with a different initial solution:

(send *fetch* :angle-vector #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0))
(send *fetch* :rarm :inverse-kinematics *co*
        :rotation-axis t :check-collision t :use-torso nil)

Now you will see that a solution different from the previous one is obtained.


Now let's check that the inverse kinematics is actually solved by displaying *co* and the coordinate of the end effector, *co-endeffector*.

(setq *co-endeffector* (send (send *fetch* :rarm :end-coords) :copy-worldcoords)) 
(objects (list *fetch* *co* *co-endeffector*))
You can see that the two coordinates (displayed as white arrows) are equal to each other.
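Instead of checking by eye, you can also compare the two coordinates numerically; a minimal sketch using the :difference-position and :difference-rotation methods of the coordinates class:

(norm (send *co* :difference-position *co-endeffector*)) ;; should be close to 0.0 [mm]
(norm (send *co* :difference-rotation *co-endeffector*)) ;; should be close to 0.0 [rad]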

move-end-pos
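A related method to the IK call above is :move-end-pos, which displaces the end effector by a given offset in millimeters and solves IK internally; a minimal sketch moving the right arm's end effector by 50 mm along x:

(send *fetch* :rarm :move-end-pos #f(50 0 0)) ;; the default reference frame is assumed here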

@HiroIshida

(load "package://pr2eus/pr2-interface.l")
(ros::load-ros-manifest "roseus")
(ros::roseus "tracker" :anonymous t)
(pr2-init)

(setq camera-frame "/head_mount_kinect_rgb_optical_frame")
(setq tfl (instance ros::transform-listener :init))
(setq msg nil)
(setq target-coords-base nil) ;; wrt base_link
(setq target-coords-camera nil) ;; wrt base_link
(ros::subscribe "/object_pose" geometry_msgs::PoseStamped
#'(lambda (msg)
(setq msg msg)
(print "subscribed")
(let* ((pose (send msg :pose))
(lt (send tfl :lookup-transform
"base_link" camera-frame
(ros::time 0))))
(setq target-coords-base (send lt :transform (ros::tf-pose->coords pose)))
(setq target-coords-camera pose))))

(defun compute-pan-angle (co-base)
(let* ((pos (send co-base :pos))
(pan-angle (atan (aref pos 1) (aref pos 0)))
(pan-angle-deg (/ (* pan-angle 180) 3.14)))
pan-angle-deg))

(setq node-state 'init)

(defun handle-start (req)
(let ((res (send req :response)))
(speak-jp "ほげ")
(setq node-state 'running)
res))

(defun handle-terminate (req)
(let ((res (send req :response)))
(setq node-state 'terminated)
res))

(ros::advertise-service "start_tracking" std_srvs::Empty #'handle-start)
(ros::advertise-service "terminate_tracking" std_srvs::Empty #'handle-terminate)

(defun tracking ()
(unix:sleep 2)
(ros::spin-once)
(speak-en "start tracking")
(loop
(unless (null target-coords-base)
(ros::spin-once)
(send pr2 :angle-vector (send ri :state :potentio-vector))
(setq angle-pan (compute-pan-angle target-coords-base))
(print angle-pan)
(send pr2 :head_pan_joint :joint-angle angle-pan)
(send pr2 :head_tilt_joint :joint-angle 44.0)
(send ri :angle-vector (send pr2 :angle-vector) 2000)
(unix:usleep 500000)
)
(when (eq node-state 'terminated) (return))
))

(unix:sleep 2)
(do-until-key
(ros::spin-once)
(print "hoge")
(unless (eq node-state 'init) (return)))
(tracking)

@HiroIshida

#!/usr/bin/env python
import rospy
from posedetection_msgs.msg import ObjectDetection

topic_name = "/ObjectDetection"
topic_type = ObjectDetection

def callback(msg):
    # rospy.loginfo("subscribing objectDetection")
    pass

rospy.init_node('dummy_subscriber')
rospy.Subscriber(topic_name, topic_type, callback)
rospy.spin()


HiroIshida commented Mar 9, 2020

catkin build roseus_mongo
catkin build active_visual_perception
catkin build feedback_grasp
catkin build force_proximity_ros (branch hard-ccrding 2346de9
catkin build image_collision_detection
catkin build gpactive_ros
catkin build tbtop_square
catkin build particle_filter


HiroIshida commented Mar 11, 2020

git clone https://github.com/HiroIshida/FA-I-sensor.git
catkin build force_proximity_ros

git clone https://github.com/HiroIshida/tbtop_square.git
catkin build tbtop_square

git clone https://github.com/HiroIshida/julia_active.git
catkin build julia_active

git clone https://github.com/HiroIshida/oven.git
catkin build oven

git clone https://github.com/HiroIshida/trajectory_tweak.git
catkin build trajectory_tweak

git clone jsk_pr2eus
catkin build pr2eus

git clone jsk_roseus
catkin build roseus_mongo

@HiroIshida

introduction to robot model

Before starting this tutorial, please run roscore in another terminal.

(load "package://fetcheus/fetch-interface.l") 
(fetch) 

The fetch function creates an instance named *fetch*, which is a geometrical model of the Fetch robot. You can now view the model by calling the following function:

(objects *fetch*)

The robot model *fetch* contains the information of its joints. The fetch robot has 10 joints, so let's look at the state of these joints.

(send *fetch* :angle-vector)
;; output: #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0)

As you can see, the states of the 10 joints are shown as a float-vector. You probably need to know which value corresponds to which joint. To this end, the following method is useful.

(send *fetch* :joint-list :name)
;; output: ("torso_lift_joint" "shoulder_pan_joint" "shoulder_lift_joint" "upperarm_roll_joint" "elbow_flex_joint" "forearm_roll_joint" "wrist_flex_joint" "wrist_roll_joint" "head_pan_joint" "head_tilt_joint")


HiroIshida commented Apr 16, 2020

introduction to robot model

Before starting this tutorial, please run roscore in another terminal.

get information of joint angles of a robot

(load "package://fetcheus/fetch-interface.l") 
(fetch) 

The fetch function creates an instance named *fetch*, which is a geometrical model of the Fetch robot. You can now view the model by calling the following function:

(objects *fetch*)

You should see the fetch model in the IRT viewer.

The robot model *fetch* contains the information of its joints. The fetch robot has 10 joints, so let's look at the state of these joints.

(send *fetch* :angle-vector)
;; output: #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0)

As you can see, the states of the 10 joints are shown as a float-vector. You probably need to know which value corresponds to which joint. To this end, the following method is useful.

(send *fetch* :joint-list :name)
;; output: ("torso_lift_joint" "shoulder_pan_joint" "shoulder_lift_joint" "upperarm_roll_joint" "elbow_flex_joint" "forearm_roll_joint" "wrist_flex_joint" "wrist_roll_joint" "head_pan_joint" "head_tilt_joint")

You get a list of 10 joint-name strings. The order of this list corresponds to the order of the float-vector you got from the :angle-vector method. By comparing the two, you know, for example, that torso_lift_joint has an angle of 20.0.

Now, let's set a custom angle vector to the robot model.

(setq *av-zero* (float-vector 0 0 0 0 0 0 0 0 0 0))
(send *fetch* :angle-vector *av-zero*)

You may want to set a specific joint instead of setting all the joint angles at once. For shoulder_pan_joint, for example, this can be done by:

(let ((shoulder-pan-joint (send *fetch* :shoulder_pan_joint)))
    (send shoulder-pan-joint :joint-angle 60))

The same thing can be done by (send *fetch* :shoulder_pan_joint :joint-angle 60), which is more common in JSK.

solving inverse kinematics

Usually, in robotics, you want to guide the robot arm's end effector to a commanded pose (position and orientation). Thus, before sending an angle vector, you must find an angle vector with which the end effector reaches the commanded pose. This can be done by solving inverse kinematics (if you are not familiar with it, please look it up). First, we create a coordinate (or a pose) *co* by

(setq *co* (make-coords :pos #f(800 0 1000) :rpy #f(0.0 0.3 1.54))) ;; #f(..) is a float-vector

Then the following code will solve the inverse kinematics.

(send *fetch* :rarm :inverse-kinematics
        (send *co* :copy-worldcoords)
        :rotation-axis t :check-collision t :use-torso nil)

Now you can see the result in the viewer.






HiroIshida commented Apr 16, 2020

introduction to robot model

Before starting this tutorial, please run roscore in another terminal.

get information of joint angles of a robot

(load "package://fetcheus/fetch-interface.l") 
(fetch) 

The fetch function creates an instance named *fetch*, which is a geometrical model of the Fetch robot. You can now view the model by calling the following function:

(objects *fetch*)

The robot model *fetch* contains the information of its joints. The fetch robot has 10 joints, so let's look at the state of these joints.

(send *fetch* :angle-vector)
;; output: #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0)

As you can see, the states of the 10 joints are shown as a float-vector. You probably need to know which value corresponds to which joint. To this end, the following method is useful.

(send *fetch* :joint-list :name)
;; output: ("torso_lift_joint" "shoulder_pan_joint" "shoulder_lift_joint" "upperarm_roll_joint" "elbow_flex_joint" "forearm_roll_joint" "wrist_flex_joint" "wrist_roll_joint" "head_pan_joint" "head_tilt_joint")

You get a list of 10 joint-name strings. The order of this list corresponds to the order of the float-vector you got from the :angle-vector method. By comparing the two, you know, for example, that torso_lift_joint has an angle of 20.0.

Now, let's set a custom angle vector to the robot model.

(setq *av-zero* (float-vector 0 0 0 0 0 0 0 0 0 0))
(send *fetch* :angle-vector *av-zero*)

Please click the IRT viewer previously opened by the objects function; then you will see that the robot model is updated.


You may want to set a specific joint instead of setting all the joint angles at once. For shoulder_pan_joint, for example, this can be done by:

(let ((shoulder-pan-joint (send *fetch* :shoulder_pan_joint)))
    (send shoulder-pan-joint :joint-angle 60))

The same thing can be done by (send *fetch* :shoulder_pan_joint :joint-angle 60), which is more common in JSK.

solving inverse kinematics

Usually, in robotics, you want to guide the robot arm's end effector to a commanded pose (position and orientation). Thus, before sending an angle vector, you must find an angle vector with which the end effector reaches the commanded pose. This can be done by solving inverse kinematics (if you are not familiar with it, please look it up). First, we create a coordinate (or a pose) *co* by

(setq *co* (make-coords :pos #f(800 0 1000) :rpy #f(0.0 0.3 1.54))) ;; #f(..) is a float-vector

Then the following code will solve the inverse kinematics.

(send *fetch* :rarm :inverse-kinematics
        (send *co* :copy-worldcoords)
        :rotation-axis t :check-collision t :use-torso nil)

Now you can see the result in the viewer.





@HiroIshida

*co*

@HiroIshida

The above function first solves IK internally to obtain the angle-vector such that the coordinate of the end effector equals *co*, and then sets the obtained angle-vector to *fetch*. (I personally find this function a bit strange and confusing; for me it would be more straightforward if it just returned the angle-vector without setting it to *fetch*.) Note that you must take care of the initial solution for the IK. In EusLisp, the angle-vector currently set to the robot is used as the initial solution; for example, in the above code I set it to #f(0 0 0 0 0 0 0 0 0 0). Since iterative optimization takes place when solving IK, the solution will change if you change the initial solution. Let's try with a different initial solution:

(send *fetch* :angle-vector #f(20.0 75.6304 80.2141 -11.4592 98.5487 0.0 95.111 0.0 0.0 0.0))
(send *fetch* :rarm :inverse-kinematics *co*
        :rotation-axis t :check-collision t :use-torso nil)

Now you will see that a solution different from the previous one is obtained.



HiroIshida commented Apr 24, 2020

I added a quick tutorial using fetch. I tried to design this tutorial to be minimal but sufficient to do something with the real robot. Currently the tutorial includes (day 1) basic usage for geometrical simulation inside EusLisp only and (day 2) the workflow to control the real robot. I am planning to explain how to make a simple tabletop demo in day 3. (My DC2 submission is due 5/8, so I will work on the day-3 part in the week after the submission.)

You may wonder why I made this tutorial from scratch rather than taking advantage of pr2eus_tutorials and jsk_fetch_gazebo_demo, so I will explain the reason here.

Actually, making a tutorial for pr2eus_tutorials was my first attempt. But simulating the PR2 is too slow (2x slower than fetch), and even when I deleted the camera and depth sensor parts from the urdf file and launched it, it did not make a significant difference (a little bit faster, of course). The bottleneck seems to be in the dynamics simulation rather than in image (depth) rendering, so I imagine that even with a GPU the situation would not change much. Especially for students who cannot use the real robot and must do everything in the simulator, this slow simulation must be a nightmare. So I gave up on PR2.

Then my second attempt was making a step-by-step tutorial for jsk_fetch_gazebo_demo. It is a great demo, but it is not minimal. In that demo, fetch starts from a location away from the desk, and because it uses MCL to navigate, it takes a really long time to reach the desk. (This is why I made this comment.)

I think manipulating objects on a table is a good starting point, so I am trying to make a tabletop demo. Unlike fetch_gazebo_demo, the robot is spawned just in front of the table and is ready to start grasping objects.


HiroIshida commented May 27, 2020

The gap around the microwave's handle is narrow, so if there is an error in the depth-direction position estimate of the microwave, the door-opening task easily fails. In this experiment, we tried to have the robot acquire, through learning, a manipulation trajectory that is robust to this depth-direction position-estimation error. Since only the depth-direction error is considered, the dimension of the feasible region is one, i.e., m = 1.

First, the initial trajectory was given by human demonstration, and then the robot learns through 100 trials. The trajectory after learning looks like this: a motion that hooks onto the bottom face of the microwave and slides along it was acquired, which intuitively appears robust to depth-direction position errors.

To show quantitatively that learning improved the robustness, we ran an experiment in which the position-estimation error was shifted little by little and we checked whether the robot could still open the door under that error. The blue points indicate success, so you can see that the region of tolerable error is greatly expanded after learning.


HiroIshida commented May 27, 2020

Furthermore, we applied the proposed method to a grasping task that uses feedback from proximity sensors. This work was done after submitting the paper to Robomech, but I would like to introduce it. As shown in the left figure, two proximity sensors are attached to the tip of the robot hand, and we consider the problem of controlling the translation and rotation of the gripper according to the sensor values. The feedback law is assumed to be parameterized by 12 parameters. This time a two-dimensional position-estimation error was considered. The video shows me, a human, trying the same kind of task with my eyes closed, relying on tactile feedback.

The initial parameters before learning were hand-tuned by me. This is a demo, using the pre-learning parameters, of whether the robot can still grasp when a certain position-estimation error is injected; because the proximity feedback does not work well, the grasp fails. 500 learning trials were then performed. Next, the following shows whether the robot can grasp, using the post-learning parameters, after injecting exactly the same position-estimation error. Thanks to a twisting grasp motion, the robot can now cope with errors it could not handle before learning.

As in the microwave door-opening task, we shifted the estimation error little by little and tested whether the robot could still grasp, for both the pre-learning and the post-learning parameters. The green region is where grasping succeeded before learning, and the green plus blue region is where it succeeded after learning. Indeed, the region of tolerable error is expanded after learning. On the other hand, the hand-tuned initial parameters performed so well that the improvement brought by learning is hard to see, which is the unfortunate point of this experiment.


HiroIshida commented May 28, 2020

Basically, the algorithm is the same even if the dimensionality changes, so for simplicity I will explain the case of a one-dimensional error and a one-dimensional parameter. In this figure, the horizontal axis represents the parameter and the vertical axis represents the error.

(... was considered, but here, to make it easier to illustrate, a one-dimensional ...)

@HiroIshida

Each sample is obtained from an actual trial with the robot. For example, this k-th sample labeled as a failure is obtained by having the robot execute the manipulation trajectory corresponding to theta_k = -0.4 under the recognition error corresponding to e_k = 1.3, and judging whether it succeeds or fails.

As introduced in the first half of the poster, the goal of the proposed algorithm is to find the trajectory parameter that maximizes the size of the FR. In this toy model, theta = 0 is the optimal value. Given some initial samples, we seek this optimal value by active sampling.

In the active-sampling loop, the pair of parameter and error to try next is computed from the data obtained so far, the robot performs a trial for it, and a label is obtained; this cycle is repeated.

Eventually the samples converge around the true optimal parameter. Samples do not concentrate anywhere other than the true optimal parameter, so efficient sampling is realized. It is a technical detail so I will omit it, but adaptively updating the SVM kernel width is another selling point of this algorithm. Finally, we obtain the estimate of the optimal parameter value, and the goal of the algorithm is achieved.
