
Using full vector from mdp travel service #156

Merged · 6 commits · Apr 9, 2015

Conversation

@hawesie (Member) commented Apr 2, 2015

This should speed up scheduling quite a lot, since the proper edge statistics support was introduced.
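As a sketch of the idea behind the PR title (hypothetical names, not the repo's actual service API): instead of querying the travel-time estimator once per (source, target) waypoint pair, the scheduler asks the travel service once for the full vector of expected times from a source node, replacing many service round-trips with one.

```python
def expected_time(stats, source, target):
    """Single-pair lookup: one call per edge, as before this PR."""
    return stats[source][target]

def expected_time_vector(stats, source):
    """Full-vector lookup: one call returns expected times to every target.

    Returns a copy so callers cannot mutate the underlying statistics.
    """
    return dict(stats[source])

# Toy edge statistics keyed by topological-map waypoint names.
stats = {"WayPoint1": {"WayPoint2": 12.0, "WayPoint3": 30.5}}

# One vectorized call replaces len(targets) single-pair calls.
vec = expected_time_vector(stats, "WayPoint1")
assert vec == {"WayPoint2": 12.0, "WayPoint3": 30.5}
assert vec["WayPoint2"] == expected_time(stats, "WayPoint1", "WayPoint2")
```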

@hawesie (Member, Author) commented Apr 5, 2015

Needs merging soon.

@marc-hanheide (Member)

Has this been tested by @Jailander or anyone else at UOL? Please report findings here.

@bfalacerda (Contributor)

Since I added this one, I've started witnessing #157. I'm not sure whether there's any relation (probably not), but it would be nice if UOL could run the version without it and check whether #157 occurs.

@marc-hanheide (Member)

@Jailander @cdondrup I know you're testing loads ATM, so can you give this one a go, too?

@cdondrup (Member) commented Apr 7, 2015

Will test it tomorrow when we have the info_terminal and walking_group running on schedule.

(commit) …t directly does not have concurrency issues with the other expected time call.
@hawesie (Member, Author) commented Apr 8, 2015

@bfalacerda changes are in and run fine here in simulation, but I'm probably missing stuff that is important on the robot.

@cdondrup (Member) commented Apr 8, 2015

When starting task-scheduler-mdp.launch

process[frenap-1]: started with pid [805]
process[special_waypoints_manager-2]: started with pid [845]
process[mdp_travel_time_estimator-3]: started with pid [853]
process[mdp_policy_executor-4]: started with pid [871]
process[schedule_server-5]: started with pid [913]
[INFO] [WallTime: 1428504563.943961] Topological MDP initialised
[ INFO] [1428504563.960160345]: Writing scheduling problems to mongodb_store
[ INFO] [1428504563.974191950]: Ready to serve schedules
process[scheduled_task_executor-6]: started with pid [994]
[INFO] [WallTime: 1428504564.264960] Topological MDP initialised
process[wait_node-7]: started with pid [1061]
process[gcal_routine-8]: started with pid [1067]
[INFO] [WallTime: 1428504565.309888] Waiting for task_executor service...
PRISM server running on port 8086
PRISM server running on port 8085
Traceback (most recent call last):
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_travel_time_estimator.py", line 57, in <module>
    mdp_estimator =  MdpTravelTimeEstimator(filtered_argv[1])
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_travel_time_estimator.py", line 25, in __init__
    self.prism_estimator=PrismJavaTalker(8085,self.directory, self.file_name)
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/src/mdp_plan_exec/prism_java_talker.py", line 20, in __init__
    self.sock.connect((HOST, PORT))
  File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_policy_executor.py", line 200, in <module>
    mdp_executor =  MdpPolicyExecutor(filtered_argv[1])
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_policy_executor.py", line 48, in __init__
    self.prism_policy_generator=PrismJavaTalker(8086,self.directory, self.file_name)
  File "/localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/src/mdp_plan_exec/prism_java_talker.py", line 20, in __init__
    self.sock.connect((HOST, PORT))
  File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 111] Connection refused
[mdp_travel_time_estimator-3] process has died [pid 853, exit code 1, cmd /localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_travel_time_estimator.py WW_GF_2015_02_22 __name:=mdp_travel_time_estimator __log:=/localhome/strands/.ros/log/957ae716-ddd8-11e4-91bb-00032d225887/mdp_travel_time_estimator-3.log].
log file: /localhome/strands/.ros/log/957ae716-ddd8-11e4-91bb-00032d225887/mdp_travel_time_estimator-3*.log
[mdp_policy_executor-4] process has died [pid 871, exit code 1, cmd /localhome/strands/aaf_ws/src/strands_executive/mdp_plan_exec/scripts/mdp_policy_executor.py WW_GF_2015_02_22 __name:=mdp_policy_executor __log:=/localhome/strands/.ros/log/957ae716-ddd8-11e4-91bb-00032d225887/mdp_policy_executor-4.log].
log file: /localhome/strands/.ros/log/957ae716-ddd8-11e4-91bb-00032d225887/mdp_policy_executor-4*.log

task-scheduler-top.launch does not have an argument called topological_map, which means the two launch files are not interchangeable.
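Making the two launch files interchangeable would mean declaring the same argument in task-scheduler-top.launch. A hypothetical sketch (the actual file contents and parameter names may differ):

```xml
<launch>
  <!-- Hypothetical: declare the argument the mdp variant already accepts,
       so both launch files take the same command line. -->
  <arg name="topological_map"/>
  <param name="topological_map_name" value="$(arg topological_map)"/>
</launch>
```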

@cdondrup (Member) commented Apr 8, 2015

When trying to use the top version of the scheduler, it starts:

[INFO] [WallTime: 1428505627.743940] adding 594jrqde9m56d0jhc9fc8cvov42015-04-08T15:06:46.549Z to the waiting queue. Now we have 1 tasks in the waiting queue.
[INFO] [WallTime: 1428505627.744359] adding p37m4ckga5s55jsjalrvrctipg2015-03-17T09:21:14.506Z to the waiting queue. Now we have 2 tasks in the waiting queue.
[INFO] [WallTime: 1428505627.746873] updating from google calendar https://www.googleapis.com/calendar/v3/calendars/hanheide.net_6hgulf44ij7ctjrf2iscj0m24o@group.calendar.google.com/events?key=AIzaSyC1rqV2yecWwV0eLgmoQH7m7PdLNX1p6a0&singleEvents=true&orderBy=startTime&maxResults=2500
[INFO] [WallTime: 1428505637.629838] task 594jrqde9m56d0jhc9fc8cvov42015-04-08T15:06:46.549Z to start after 2015-04-08 16:12:00 should be scheduled now.
[INFO] [WallTime: 1428505637.630198] removing 594jrqde9m56d0jhc9fc8cvov42015-04-08T15:06:46.549Z from the waiting queue. Now we have 1 tasks in the waiting queue.
[INFO] [WallTime: 1428505637.630420] removed outdated task p37m4ckga5s55jsjalrvrctipg2015-03-17T09:21:14.506Z that was planned to end until 2015-03-17 10:00:00
[INFO] [WallTime: 1428505637.630626] removing p37m4ckga5s55jsjalrvrctipg2015-03-17T09:21:14.506Z from the waiting queue. Now we have 0 tasks in the waiting queue.
[INFO] [WallTime: 1428505637.630813] Sending 1 tasks to the scheduler
[INFO] [WallTime: 1428505637.662743] Got a further 1 tasks to schedule
[INFO] [WallTime: 1428505637.683436] Waiting for topological_navigation/travel_time_estimator
[INFO] [WallTime: 1428505637.685430] ... and got topological_navigation/travel_time_estimator

So it updates correctly from Google Calendar and sends the task to the scheduler. Afterwards I run

strands@linda:~/aaf_ws$ rosservice call /task_executor/set_execution_status "status: true"
previous_status: False
success: True
remaining_execution_time:
  secs: 0
  nsecs: 0

but /current_schedule doesn't publish anything, and nothing happens either.

@cdondrup (Member) commented Apr 8, 2015

I think I found where it dies. I removed the task from the calendar and started the scheduler in top mode again. After setting it to execute, it started publishing an empty schedule on /current_schedule. I then created a task in the calendar, and the following happened:

[INFO] [WallTime: 1428506283.575127] updating from google calendar https://www.googleapis.com/calendar/v3/calendars/hanheide.net_6hgulf44ij7ctjrf2iscj0m24o@group.calendar.google.com/events?key=AIzaSyC1rqV2yecWwV0eLgmoQH7m7PdLNX1p6a0&singleEvents=true&orderBy=startTime&maxResults=2500
[INFO] [WallTime: 1428506283.764972] changes in the calendar to process +1 -0
[INFO] [WallTime: 1428506283.765847] instantiate mumd1em5no2snes3esssb7df842015-04-08T15:17:43.718Z
task_id: 0
start_node_id: WayPoint9
end_node_id: WayPoint9
action: walking_group_slow
start_after:
  secs: 0
  nsecs: 0
end_before:
  secs: 0
  nsecs: 0
max_duration:
  secs: 3600
  nsecs: 0
execution_time:
  secs: 0
  nsecs: 0
arguments: []
[INFO] [WallTime: 1428506283.774668] adding mumd1em5no2snes3esssb7df842015-04-08T15:17:43.718Z to the waiting queue. Now we have 1 tasks in the waiting queue.
[INFO] [WallTime: 1428506293.320057] task mumd1em5no2snes3esssb7df842015-04-08T15:17:43.718Z to start after 2015-04-08 16:20:00 should be scheduled now.
[INFO] [WallTime: 1428506293.320485] removing mumd1em5no2snes3esssb7df842015-04-08T15:17:43.718Z from the waiting queue. Now we have 0 tasks in the waiting queue.
[INFO] [WallTime: 1428506293.320715] Sending 1 tasks to the scheduler
[INFO] [WallTime: 1428506293.341972] Got a further 1 tasks to schedule
[INFO] [WallTime: 1428506293.365584] Waiting for topological_navigation/travel_time_estimator
[INFO] [WallTime: 1428506293.368002] ... and got topological_navigation/travel_time_estimator

And nothing ever happened thereafter. /current_schedule is dead as well. Something seems to go wrong after getting the travel time estimate. Sorry for all the spam. Any idea @hawesie?

@hawesie (Member, Author) commented Apr 8, 2015

See if you have any other instances of PRISM running, and kill them if you do. The first time I run the MDP version on any system it gives me an error, but it runs the next time. I have a suspicion about the other issue, but let's try to get the MDP bit working first, as that's the intended system.
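The "Connection refused" tracebacks above are consistent with the node trying to connect before the PRISM server is listening on its port, or with a stale PRISM instance holding the port from an earlier run. A minimal retry-until-listening helper is sketched below; it is a hypothetical illustration, not the repo's actual code.

```python
import socket
import time

def wait_for_port(host, port, timeout=10.0, interval=0.5):
    """Retry TCP connects until something accepts on (host, port).

    Returns True once a connection succeeds, or False if we still get
    "Connection refused" (errno 111) after `timeout` seconds.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            sock = socket.create_connection((host, port), timeout=interval)
            sock.close()
            return True
        except (socket.error, OSError):
            time.sleep(interval)
    return False

# e.g. call wait_for_port("localhost", 8085) before constructing the
# socket client, instead of connecting once and dying on failure.
```

If the port stays busy even after the launch is stopped, that points at a leftover PRISM process rather than a startup race.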

@bfalacerda (Contributor)

This has been working on Bob throughout the day.

@cdondrup, the issues you posted don't seem to be directly related to this PR, so is it ok if I merge it? Is the issue with launching task-scheduler-mdp.launch solved?

@cdondrup (Member) commented Apr 8, 2015

I'm in the process of trying it; there apparently was a PRISM process still running, and now it launches. Can't tell if anything intelligent is happening though. Have to tie up a few loose ends first.

@bfalacerda (Contributor)

OK, I'll wait for some feedback then.

@cdondrup (Member) commented Apr 8, 2015

OK, so the top version seems to work when a task is scheduled not via the Google Calendar interface: at least we restarted everything and tried the top version with info_terminal, and that worked. I think you can merge this once the tests pass, and I will keep trying the mdp version tomorrow.

@marc-hanheide (Member)

retest this please

@marc-hanheide (Member)

As said in #155, the test itself seems to be unstable.

hawesie added a commit that referenced this pull request on Apr 9, 2015: "Using full vector from mdp travel service"
@hawesie merged commit d460a85 into strands-project:hydro-release on Apr 9, 2015