Home
For the final deliverable, according to MacCallister Higgins of Udacity in #didi-channel at https://carnd.slack.com/archives/C4FJZ0JH3/p1489428114341653:
"We will be releasing additional training and test datasets in both ROS bag and Kitti format. We are currently talking directly with the people behind the Kitti data/challenges and we both agree that ROS bags are now the better format for data release, but we'd need to potentially modify their evaluation framework (see their dev kit) to accommodate that. The challenge page currently also incorrectly states that new datasets will have stereo camera imagery since we included the Kitti dataset format for reference, but our data will only have a single forward facing monocular camera. We are currently pointing competitors towards the Kitti datasets to get started immediately if they want, but your solution shouldn't be so narrow as to only apply to those datasets. Monocular camera, LIDAR, and Radar data will be provided, so you should currently approach the problem at a higher level. Solutions will need to run in a ROS node, meaning that either C/C++ or Python can be used, but since there are real-time performance constraints, we would suggest a final implementation in C/C++."
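Since the solution has to run as a ROS node, here is a minimal Python (rospy) sketch of what such a node could look like. The node name and topic names (`obstacle_detector`, `/image_raw`, `/velodyne_points`) are placeholders for illustration only and will depend on the released ROS bags; a final real-time submission would likely be reimplemented in C/C++ as suggested above.

```python
#!/usr/bin/env python
# Minimal sketch of a ROS node subscribing to camera and LIDAR data.
# Topic names below are assumptions; check the released ROS bags for the
# actual topic names and message types.

import rospy
from sensor_msgs.msg import Image, PointCloud2


def image_callback(msg):
    # Placeholder: run the camera-based part of the detection pipeline here.
    rospy.loginfo("Image frame received: %dx%d", msg.width, msg.height)


def lidar_callback(msg):
    # Placeholder: run the LIDAR-based part of the detection pipeline here.
    rospy.loginfo("Point cloud received: %d bytes", len(msg.data))


if __name__ == "__main__":
    rospy.init_node("obstacle_detector")
    rospy.Subscriber("/image_raw", Image, image_callback)
    rospy.Subscriber("/velodyne_points", PointCloud2, lidar_callback)
    rospy.spin()  # Process callbacks until the node is shut down
```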
Also from the #didi-channel, at https://carnd.slack.com/archives/C4FJZ0JH3/p1489431338466233, on the submission format:
vivek1108 [2:55 PM] Ok, so in submission we submit a csv, or a notebook/readme with code and testing details?

mac [2:56 PM] Yes, the finalists will be responsible for transmitting code with dependency/install instructions so that we can replicate results and verify