diff --git a/docs/activity_definition.md b/docs/activity_definition.md
new file mode 100644
index 00000000..cdea7178
--- /dev/null
+++ b/docs/activity_definition.md
@@ -0,0 +1,139 @@
+# Activity definition
+
+## Summary
+
+BDDL activities are defined by a set of **objects** in a scene, a ground **initial condition** that the scene configuration satisfies when the agent starts the activity, and a **goal condition** logical expression that the scene configuration must satisfy for the agent to reach success. The following example demonstrates this:
+
+```
+(define
+    (problem cleaning_the_pool_simplified)
+    (:domain igibson)
+
+    (:objects
+        pool.n.01_1 - pool.n.01
+        floor.n.01_1 - floor.n.01
+        scrub_brush.n.01_1 - scrub_brush.n.01
+        sink.n.01_1 - sink.n.01
+        agent.n.01_1 - agent.n.01
+    )
+
+    (:init
+        (onfloor pool.n.01_1 floor.n.01_1)
+        (stained pool.n.01_1)
+        (onfloor scrub_brush.n.01_1 floor.n.01_1)
+        (inroom floor.n.01_1 garage)
+        (inroom sink.n.01_1 storage_room)
+        (onfloor agent.n.01_1 floor.n.01_1)
+    )
+
+    (:goal
+        (and
+            (onfloor ?pool.n.01_1 ?floor.n.01_1)
+            (not
+                (stained ?pool.n.01_1)
+            )
+            (ontop ?scrub_brush.n.01_1 ?shelf.n.01_1)
+        )
+    )
+)
+```
+The `:objects` and `:init` sections specify the initial state as a set of objects and a set of initial atomic formulae the objects must satisfy at the start of the task. The `:goal` section specifies the expression that the objects must satisfy for the task to be considered successfully completed, i.e. what the agent needs to achieve.
+
+## BDDL language
+
+BDDL includes two types of files: the **domain** file and the **problem** file. There is one domain file per simulator, and one problem file per activity definition. We use "problem" only to stay consistent with the Planning Domain Definition Language (PDDL), and from here on will use only "activity definition". In general, BDDL activity definitions are logical expressions using standard logical operators, some custom operators, object instances as ground terms, object categories as variables, and object properties as predicates.
+
+### Domain file
+
+The domain contains three sections: the domain name (`(domain <domain_name>)`); the requirements (`(:requirements :strips :adl)`, as BDDL relies on these); and the predicates, a list of predicates with fields indicating their arity. See the example created for iGibson 2.0 [here](https://github.com/StanfordVL/bddl/blob/master/bddl/activity_definitions/domain_igibson.bddl).
+
+### Activity definition file header
+
+The activity definition is more complex. It consists of an activity definition name, a domain, objects, an initial condition, and a goal condition. See examples in subdirectories [here](https://github.com/StanfordVL/bddl/tree/master/bddl/activity_definitions).
+
+By convention, the problem (`problem`) section should take the form of `(problem <behavior_activity>_<activity_instance>)`, where `activity_instance` is some identifying number that distinguishes this definition of `behavior_activity` from other definitions of the same activity, since a BEHAVIOR activity (e.g. "packing lunches" or "cleaning bathtub") can be defined multiple times to make multiple versions.
+
+### Activity definition file domain
+
+The domain (`:domain`) section should take the form of `(:domain <domain_name>)`, where `domain_name` matches the domain `define`d in some domain file.
+
+### Activity definition file `:objects` section
+
+The objects (`:objects`) section should contain all object instances involved in the activity definition, categorized.
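+Putting the header, `:domain`, and `:objects` conventions together, a hypothetical second definition of the activity from the summary might open like the following sketch (the `_1` instance suffix and the extra scrub brush are illustrative placeholders; the `:init` and `:goal` sections would follow as in the full example above):
+
+```
+(define
+    (problem cleaning_the_pool_1)
+    (:domain igibson)
+
+    (:objects
+        pool.n.01_1 - pool.n.01
+        floor.n.01_1 - floor.n.01
+        scrub_brush.n.01_1 scrub_brush.n.01_2 - scrub_brush.n.01
+        agent.n.01_1 - agent.n.01
+    )
+)
+```
+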
+For example, for an activity definition with three instances of some category `mycat`, `:objects` should include the following line: `mycat_1 mycat_2 mycat_3 - mycat`. BDDL requires that object instances be written as `<category>_<instance_number>`, where `category` is a WordNet synset (see the **Annotations for activity definition** section for details on the role of WordNet in BDDL). `:objects` should list and categorize every object instance used in the definition.
+
+### Activity definition file `:init` section
+
+The initial condition (`:init`) should consist of a list of ground atomic formulae. This means that `:init` cannot contain logical operators such as `and`, `forall`, or `forpairs`, and all objects involved in it must be instances - concrete object instances like `mycat_1` - and not variables that may indicate multiple possible instances (e.g. just `mycat`, which could be any instance with category `mycat`). `:init` can contain certain types of negated atomic formulae (using the `not` logical operator) - specifically, when the atomic formula does **not** involve the location of objects. So, a BDDL activity definition **can** have `(not (cooked mycat_1))` but it **cannot** have `(not (ontop mycat_1 othercat_1))`. This is for the sampler - it is difficult to sample "anywhere-but" efficiently and ecologically.
+
+The `:init` section has some additional requirements. For this, we note that all objects in BDDL are either **scene objects** or **additional objects**. **Scene objects** are those that are already available by default in the simulator scene that you are instantiating an activity definition in. **Additional objects** are those that will be added to the scene by the sampler. To make sure that the `:init` that a sampler needs to handle is not underspecified, BDDL requires the following:
+- Every scene object involved in the activity definition must appear in exactly one `inroom` atomic formula specifying the room it's in, e.g. `(inroom myscenecat_1 kitchen)`. Note that you do not need to list all of the scene's default objects in your activity definition; you only need to list the ones that are relevant to you.
+- Every additional object must appear in a binary atomic formula that specifies its position relative to a scene object, either directly or transitively. So, the following is acceptable, because it has a direct positioning:
+```
+(ontop myaddcat_1 myscenecat_1)
+```
+And this is also acceptable, because it has indirect positioning for some additional objects but they are all ultimately positioned relative to a scene object:
+```
+(ontop myaddcat_1 myaddcat_2)
+(ontop myaddcat_2 myscenecat_1)
+(ontop myaddcat_3 myaddcat_1)
+```
+The following is not acceptable because even though all additional objects appear as arguments to a positional predicate, they are not all placed relative to scene objects:
+```
+(ontop myaddcat_1 myscenecat_1)
+(ontop myaddcat_2 myaddcat_3)
+```
+
+### Activity definition file `:goal` section
+
+Finally, the goal condition (`:goal`) should consist of one logical expression, likely a conjunction of clauses. This expression can use any of the standard logical operators used in the [Planning Domain Definition Language (PDDL)](https://planning.wiki/ref/pddl/problem), namely `and`, `or`, `not`, `imply`, `forall`, and `exists`. It can also use our custom operators: `forn`, `forpairs`, and `fornpairs`.
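+For instance, goal clauses using these operators might look like the following sketch (hypothetical clauses written in the style of existing definitions - the categories, predicates, and count are illustrative only; the operators themselves are defined next):
+
+```
+(forpairs
+    (?scrub_brush.n.01 - scrub_brush.n.01)
+    (?sink.n.01 - sink.n.01)
+    (inside ?scrub_brush.n.01 ?sink.n.01)
+)
+(forn
+    (2)
+    (?scrub_brush.n.01 - scrub_brush.n.01)
+    (soaked ?scrub_brush.n.01)
+)
+```
+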
+These custom operators are defined as follows:
+- `forn`: for some non-negative integer `n` and some object category `mycat`, the child condition must hold true for at least `n` instances of category `mycat`
+- `forpairs`: for two object categories `mycat` and `othercat`, the child condition must hold true for some one-to-one mapping of object instances of `mycat` to object instances of `othercat` that covers all instances of at least one of the two categories
+- `fornpairs`: for some non-negative integer `n` and two object categories `mycat` and `othercat`, the child condition must hold true for at least `n` pairs of instances of `mycat` and instances of `othercat` that follow a one-to-one mapping.
+
+Unlike the `:init` section, where object instances can be used as terms but object categories cannot, `:goal` can use object categories as *bound variables*. A variable must be bound in a quantifier to be concrete and not ambiguous, and in `:goal` we have several quantifiers available: `forall`, `exists`, `forn`, `forpairs`, and `fornpairs`. In the `:goal`, BDDL allows any of these quantifiers to be used to bind categories, to make goal conditions that are more flexible and concise than those that specify object instances everywhere.
+
+## Annotations for activity definition
+
+Using BDDL to make activity definition content requires two types of annotations:
+1. Annotations of object categories as [WordNet](https://wordnet.princeton.edu/) *synsets* (terms for distinct concepts) that follow the WordNet hierarchy.
+2. Annotations mapping object categories to properties they exhibit.
+
+### Hierarchical object category annotations
+The object categories in BDDL are all WordNet synsets. WordNet synsets are single terms for groups of cognitive synonyms, removing the ambiguity coming from multiple words meaning the same thing or one word having multiple senses. In BDDL, each category means exactly what its synset refers to.
+
+BDDL object categories (such as `mycat` and `othercat` above) have the following syntax: `<word>.n.<sense_number>`. This is a subset of WordNet's synset syntax - the `n` in the middle of the BDDL category syntax indicates "noun". In WordNet, there are a few other options for other parts of speech that are not relevant to BDDL.
+
+The list of BDDL object categories is found in [`objectmodeling.csv`](https://github.com/StanfordVL/bddl/blob/master/utils/objectmodeling.csv). Note that these are specifically categories that also have at least one 3D model in the BEHAVIOR Object Dataset.
+
+The synset formulation allows BDDL objects to follow the WordNet hierarchy, meaning that an object will not only be seen as an instance of its own category, but also of all parent categories.
+
+The hierarchy is specified in `hierarchy*.json` files. These are generated by running [`hierarchy_generator.py`](https://github.com/StanfordVL/bddl/blob/master/utils/hierarchy_generator.py); see the script for more details on the format of the various hierarchies generated.
+
+### Object-to-property mapping
+As stated above, objects are terms in BDDL and their properties are predicates. BDDL currently requires support for the set of properties specified in [`domain_igibson.bddl`](https://github.com/StanfordVL/bddl/blob/master/bddl/activity_definitions/domain_igibson.bddl) - all the unary predicates are states of individual objects.
+
+Of course, not every property applies to every object. Therefore, BDDL requires a mapping of every object to the properties it has.
+For BDDL's original supported objects, these mappings come from crowdsourced annotations. These annotations are found in [`synsets_to_filtered_properties.json`](https://github.com/StanfordVL/bddl/blob/master/utils/synsets_to_filtered_properties.json). Furthermore, the mapping alone is not always enough - some properties require additional parameters. For example, different items may all be cookable and burnable, but cook and burn at different temperatures, so they not only need to be annotated as `cookable` or `burnable`, but also with a `cook_temperature` and `burn_temperature`. `synsets_to_filtered_properties.json` shows the correct syntax for the annotations.
+
+Finally, these properties are filtered to be inherited through the hierarchy. More concretely, the hierarchy generation script enforces that in `hierarchy*.json`, the properties of any non-leaf category will be the intersection of the properties of all its descendant categories; only the leaf categories' properties are determined by their crowdsourced annotation. Note that `synsets_to_filtered_properties.json` does **not** obey this rule, so you cannot use it directly to determine properties, only to enter and store original annotations. You must use `hierarchy*.json` for determining properties that actually apply in the activity definition.
+
+### Adding your own object to BDDL
+
+Adding your own object to BDDL requires providing the above annotations. Do not do so unless you support an appropriate 3D object model in your simulator for the category you are adding. For instructions on how to add an object model to iGibson 2.0, click [here]().
+
+Once you have a usable object model, do the following:
+1. Find the most suitable synset for your object category in WordNet.
+2. Add your object's synset, as well as a non-synset label if you want, to `bddl/utils/objectmodeling.csv`. **Note:** if the synset you chose is already in `objectmodeling.csv`, you can stop here!
+3. Decide which of the properties in `bddl/bddl/activity_definitions/domain_igibson.bddl` apply to your object and annotate them in `bddl/utils/synsets_to_filtered_properties.json`, using the correct syntax and providing any necessary parameters.
+4. *Only if your simulator definitely supports each property for this object*, edit `bddl/utils/prune_object_property.py` to include your object and all its supported properties, and specify an outfile for your purposes; if you are using iGibson, keep the default value `synsets_to_filtered_properties_pruned_igibson.json`.
+5. Run `bddl/utils/hierarchy_generator.py` to generate the various hierarchy JSONs, which will now contain your synset associated with its properties.
+
+## Creating your own activity definition
+Use the following steps to create your own BDDL activity.
+1. **Choose an activity you want to define,** like "putting away groceries" or "washing dishes". You can choose from the [list of BEHAVIOR activities](https://behavior.stanford.edu/activity-annotation) or you can make your own.
+2. **Choose a method of making your definition.** There are multiple ways to do so:
+    a. Using our visual-BDDL annotator [here](https://behavior.stanford.edu/activity-annotation). This requires no setup and offers helpful constraints, but your definition is not guaranteed to fit in a scene even if it is syntactically correct.
+    b. Using a local version of our annotation system (download the [code](https://github.com/StanfordVL/behavior-activity-annotator)).
This requires more setup, offers helpful constraints, and if you attach a simulator to test sampling (example implementation in iGibson 2.0 [here](TODO)), you can edit your definition until it fits. + c. Writing your own BDDL file directly. This requires no setup, and has neither helpful constraints nor limiting constraints. Your definition is not guaranteed to be syntactically correct or fit in a scene. +3. **Define the activity in BDDL.** + a/b. If you are using our annotation interface, whether locally or online, you will find instructions at the [landing page](https://behavior.stanford.edu/activity-annotation) and more detailed instructions in the interface itself. + c. If you are writing BDDL directly, refer to the earlier section of this page as well as [existing definitions](https://github.com/StanfordVL/bddl/tree/master/bddl/activity_definitions) for guidance. \ No newline at end of file diff --git a/docs/dataset.md b/docs/dataset.md deleted file mode 100644 index 50cc050d..00000000 --- a/docs/dataset.md +++ /dev/null @@ -1,140 +0,0 @@ -Dataset -========================================== - -In dataset we include two parts. First we introduce the new iGibson dataset in this release. Secondly, we introduce - how to download previous Gibson dataset, which is updated and compatible with iGibson. - -- [Download iGibson 2.0 Scenes and Behavior Dataset of Objects](#download-igibson-2.0-scenes-and-behavior-dataset-of-objects) -- [Download iGibson 1.0 Scenes](#download-igibson-1.0-scenes) -- [Download Gibson Scenes](#download-gibson-scenes) - -Download iGibson 2.0 Scenes and Behavior Dataset of Objects -------------------------- - -- iGibson 2.0 Dataset of Scenes: New versions of the fully interactive scenes, more densely populated with objects. -- BEHAVIOR Object Dataset: Dataset of object models annotated with physical and semantic properties. The 3D models are free to use within iGibson 2.0 for BEHAVIOR (due to artists' copyright, models are encrypted and allowed only to be used with iGibson 2.0). You can download a bundle of the iGibson 2.0 dataset of scenes and the BEHAVIOR dataset of objects here. - -To download both in a bundle, you need to follow the following steps: -Request access to the BEHAVIOR assets using this [form](https://docs.google.com/forms/d/e/1FAIpQLScPwhlUcHu_mwBqq5kQzT2VRIRwg_rJvF0IWYBk_LxEZiJIFg/viewform): -- Fill out the license agreement as suggested in the [form](https://docs.google.com/forms/d/e/1FAIpQLScPwhlUcHu_mwBqq5kQzT2VRIRwg_rJvF0IWYBk_LxEZiJIFg/viewform) -- When done, copy the key you receive (igibson.key) into the repository subfolder folder iGibson/igibson/data -- Download the behavior data bundle (ig_dataset) [here](https://storage.googleapis.com/gibson_scenes/behavior_data_bundle.zip). -- Unzip ig_dataset: - - `mkdir iGibson/igibson/data` - - `unzip behavior_data_bundle.zip -d iGibson/igibson/data` - - -Download iGibson 1.0 Scenes ------------------------- - -We annotate fifteen 3D reconstructions of real-world scans and convert them into fully interactive scene models. In this process, we respect the original object-instance layout and object-category distribution. The object models are extended from open-source datasets ([ShapeNet Dataset](https://www.shapenet.org/), [Motion Dataset](http://motiondataset.zbuaa.com/), [SAPIEN Dataset](https://sapien.ucsd.edu/)) enriched with annotations of material and dynamic properties. - -The fifteen fully interactive models are visualized below. 
- -![placeholder.jpg](images/ig_scene.png) - -#### Download Instruction -To download the dataset, you need to first configure where the dataset is to be stored. You can change it in `your_installation_path/igibson/global_config.yaml` (default and recommended: `ig_dataset: your_installation_path/igibson/data/ig_dataset`). iGibson scenes can be downloaded with one single line: - -```bash -python -m igibson.utils.assets_utils --download_ig_dataset -``` - -If the script fails to work, you can download from this [direct link](https://storage.googleapis.com/gibson_scenes/ig_dataset.tar.gz) and extract to `your_installation_path/igibson/data/ig_dataset`. -#### Dataset Format -The new dataset format can be found [here](https://github.com/StanfordVL/iGibson/tree/master/igibson/utils/data_utils). - -#### Cubicasa / 3D Front Dataset -We provide support for Cubicasa and 3D Front Dataset, to import them into iGibson, follow the guide [here](https://github.com/StanfordVL/iGibson/tree/master/igibson/utils/data_utils/ext_scene). - -Download Gibson Scenes ------------------------- -Original Gibson Environment Dataset has been updated to use with iGibson simulator. The link will first take you to - the license agreement and then to the data. - -[[ Get download link for Gibson Data ]]. - -License Note: The dataset license is included in the above link. The license in this repository covers only the provided software. - -Files included in this distribution: - -1. All scenes, 572 scenes (108GB): gibson_v2_all.tar.gz -2. 4+ partition, 106 scenes, with textures better packed (2.6GB): gibson_v2_4+.tar.gz -3. Demo scene `Rs` - -To download 1 and 2, you need to fill in the agreement and get the download link `URL`, after which you can - manually download and store them in the path set in `your_installation_path/igibson/global_config.yaml` (default and - recommended: `dataset: your_installation_path/igibson/data/g_dataset`). You can run a single command to download the dataset - , this script automatically download, decompress, and put the dataset to correct place. -```bash -python -m igibson.utils.assets_utils --download_dataset URL -``` - -To download 3, you can run: - -```bash -python -m igibson.utils.assets_utils --download_demo_data -``` - - -### Original Gibson Environment Dataset Description (Non-interactive) - - -Full Gibson Environment Dataset consists of 572 models and 1440 floors. We cover a diverse set of models including households, offices, hotels, venues, museums, hospitals, construction sites, etc. A diverse set of visualization of all spaces in Gibson can be seen [here](http://gibsonenv.stanford.edu/database/). - - -![spaces.png](images/spaces.png) - - -#### Dataset Metadata - -Each space in the database has some metadata with the following attributes associated with it. The metadata is available in this [JSON file](https://raw.githubusercontent.com/StanfordVL/GibsonEnv/master/gibson/data/data.json). -``` -id # the name of the space, e.g. ""Albertville"" -area # total metric area of the building, e.g. "266.125" sq. meters -floor # number of floors in the space, e.g. "4" -navigation_complexity # navigation complexity metric, e.g. "3.737" (see the paper for definition) -room # number of rooms, e.g. "16" -ssa # Specific Surface Area (A measure of clutter), e.g. 
"1.297" (see the paper for definition) -split_full # if the space is in train/val/test/none split of Full partition -split_full+ # if the space is in train/val/test/none split of Full+ partition -split_medium # if the space is in train/val/test/none split of Medium partition -split_tiny # if the space is in train/val/test/none split of Tiny partition -``` - -#### Dataset Format - -Each space in the database has its own folder. All the modalities and metadata for each space are contained in that folder. -``` -mesh_z_up.obj # 3d mesh of the environment, it is also associated with an mtl file and a texture file, omitted here -floors.txt # floor height -floor_render_{}.png # top down views of each floor -floor_{}.png # top down views of obstacles for each floor -floor_trav_{}.png # top down views of traversable areas for each floor -``` - -For the maps, each pixel represents 0.01m, and the center of the image correspond to `(0,0)` in the mesh, as well as in the pybullet coordinate system. - -#### Dataset Metrics - - -**Floor Number** Total number of floors in each model. - -We calculate floor numbers using distinctive camera locations. We use `sklearn.cluster.DBSCAN` to cluster these locations by height and set minimum cluster size to `5`. This means areas with at least `5` sweeps are treated as one single floor. This helps us capture small building spaces such as backyard, attics, basements. - -**Area** Total floor area of each model. - -We calculate total floor area by summing up area of each floor. This is done by sampling point cloud locations based on floor height, and fitting a `scipy.spatial.ConvexHull` on sample locations. - -**SSA** Specific surface area. - -The ratio of inner mesh surface and volume of convex hull of the mesh. This is a measure of clutter in the models: if the inner space is placed with large number of furnitures, objects, etc, the model will have high SSA. - -**Navigation Complexity** The highest complexity of navigating between arbitrary points within the model. - -We sample arbitrary point pairs inside the model, and calculate `A∗` navigation distance between them. `Navigation Complexity` is equal to `A*` distance divide by `straight line distance` between the two points. We compute the highest navigation complexity for every model. Note that all point pairs are sample within the *same floor*. - -**Subjective Attributes** - -We examine each model manually, and note the subjective attributes of them. This includes their furnishing style, house shapes, whether they have long stairs, etc. - diff --git a/docs/index.rst b/docs/index.rst index 19e43f2c..1b989256 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -1,9 +1,9 @@ -.. InteractiveGibsonEnv documentation master file, created by - sphinx-quickstart on Tue Nov 19 14:38:54 2019. +.. BEHAVIOR Domain Definition Language documentation master file, created by + sphinx-quickstart on Sun Nov 14 18:27:54 2021. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. -Welcome to iGibson's documentation! +Welcome to the BDDL documentation! ================================================== .. toctree:: diff --git a/docs/installation.md b/docs/installation.md index 72bd7cb6..98f336ad 100644 --- a/docs/installation.md +++ b/docs/installation.md @@ -1,220 +1,37 @@ -# Installation -There are two steps to install iGibson, the Interactive Gibson Environment, on your computer. +# Installation +Installing BDDL is very simple. 
-First, you need to install the simulation environment. Then, you need to download the assets: models of the robotic agents, the interactive objects and 3D reconstructed real-world large environments for your agents to train. +## System Requirements -### System Requirements +BDDL requires python 3. It has some required packages which will be installed automatically. It has been tested on: +- Python: >= 3.6 +- Linux: Ubuntu >= 16.04 +- Windows: Windows 10 +- Mac: macOS >= 10.15 -The minimum system requirements are the following: +Given minimal requirements, we expect BDDL to work with most systems that have Python 3. Note that if you are using the [iGibson 2.0](https://github.com/StanfordVL/iGibson) simulator with BDDL, you will have a longer list of requirements to check. -- Linux - - Ubuntu 16.04 - - Nvidia GPU with VRAM > 6.0GB - - Nvidia driver >= 384 - - CUDA >= 9.0, CuDNN >= v7 - - CMake >= 2.8.12 (can install with `pip install cmake`) - - g++ (GNU C++ compiler) - - libegl-dev (Debian/Ubuntu: vendor neutral GL dispatch library -- EGL support) -- Windows - - Windows 10 - - Nvidia GPU with VRAM > 6.0GB - - Nvidia driver >= 384 - - CUDA >= 9.0, CuDNN >= v7 - - CMake >= 2.8.12 (can install with `pip install cmake`) - - Microsoft Visual Studio 2017 with visual C++ tool and latest Windows 10 SDK -- Mac OS X - - Tested on 10.15 - - PBR features not supported - - CMake >= 2.8.12 (can install with `pip install cmake`) +## Installing the library -Other system configurations may work, but we haven't tested them extensively and we probably won't be able to provide as much support as we want. - -## Installing dependencies - -Beginning with a clean ubuntu 20.04 installation, you **must run the following script as root/superuser** (`sudo su`) which will install all needed dependencies to build and run iGibson with CUDA 11.1: - -```bash -# Add the nvidia ubuntu repositories -apt-get update && apt-get install -y --no-install-recommends \ - gnupg2 curl ca-certificates && \ - curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | apt-key add - && \ - echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 /" > /etc/apt/sources.list.d/cuda.list && \ - echo "deb https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 /" > /etc/apt/sources.list.d/nvidia-ml.list - -# The following cuda libraries are required to compile igibson -# Note: the following assumes you will be using nvidia drivers on a headless node -# please use xserver-xorg-video-nvidia-470 if you are on a desktop -apt-get update && apt-get update && apt-get install -y --no-install-recommends \ - nvidia-headless-470 \ - cuda-cudart-11-1=11.1.74-1 \ - cuda-compat-11-1 \ - cuda-command-line-tools-11-1=11.1.1-1 \ - cuda-libraries-dev-11-1=11.1.1-1 \ - -# For building and running igibson -apt-get update && apt-get install -y --no-install-recommends \ - cmake \ - git \ - g++ \ - libegl-dev -``` - -Conda is recommended over standard virtual environments. 
To setup anaconda with the requisite dependencies, run the following as your user account (**not as root/superuser**): - -```bash -# Install miniconda -curl -LO http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh -bash Miniconda-latest-Linux-x86_64.sh -rm Miniconda-latest-Linux-x86_64.sh - -# Add conda to your PATH -echo "export PATH=$HOME/.miniconda/bin:$PATH" >> .bashrc - -# Update conda and create a virtual environment for iGibson -conda update -y conda -conda create -y -n igibson python=3.8 -conda activate igibson -``` - -By default, iGibson builds with CUDA support which requires that `nvcc` is on your path (or CUDA 11 is symlinked to `/usr/local/cuda` from `/usr/local/cuda-11.1`). Cmake uses `nvcc` to find the CUDA libraries and headers when building iGibson. Add the following to your shell rc (`.bashrc`, `.zshrc`, etc.) and re-login to your shell (`exec bash`, `exec zsh`, etc.): -```bash -export PATH=/usr/local/cuda-11.1/bin:$PATH -``` - -Then verify nvcc is on your PATH: -```bash -$ nvcc --version -nvcc: NVIDIA (R) Cuda compiler driver -Copyright (c) 2005-2020 NVIDIA Corporation -Built on Mon_Oct_12_20:09:46_PDT_2020 -Cuda compilation tools, release 11.1, V11.1.105 -Build cuda_11.1.TC455_06.29190527_0 -``` - -To build without CUDA support (used for the "rendering to GPU tensor" feature), you will have to set `USE_CUDA` to `False` in `iGibson/igibson/render/CMakeLists.txt`. - -## Installing the Environment - -We provide 3 methods to install the simulator. +There are two ways to install BDDL. ### 1. pip iGibson's simulator can be installed as a python package using pip: ```bash -pip install igibson # This step takes about 4 minutes -# run the demo -python -m igibson.examples.demo.demo_static -``` - -### 2. Docker image - -Docker provides an easy way to reproduce the development environment across platforms without manually installing the software dependencies. We have prepared docker images that contain everything you need to get started with iGibson. - -First, install Docker from the [official website](https://www.docker.com/). Please make sure that the docker version is at least v19.0 to enable native GPU support. - -Next, download our pre-built images with the script in the `iGibson` repo: - +pip install bddl ``` -cd iGibson -./docker/pull-images.sh -``` - -Two images will be downloaded: -* `igibson/igibson:latest`: smaller image, but does not support GUI. -* `igibson/igibson-gui:latest`: supports GUI and remote desktop access via VNC. -We also provide scripts to build the images from scratch: -``` -# image without GUI: -cd iGibson/docker/base -./build.sh - -# image with GUI and VNC: -cd iGibson/docker/headless-gui -./build.sh -``` - - -### 3. Compile from source - -Alternatively, iGibson can be compiled from source: [iGibson GitHub Repo](https://github.com/StanfordVL/iGibson). First, you need to install anaconda following the guide on [their website](https://www.anaconda.com/). +### 2. From source ```bash -git clone https://github.com/StanfordVL/iGibson --recursive -cd iGibson - -conda create -n py3-igibson python=3.8 anaconda # we support python 2.7, 3.5, 3.6, 3.7, 3.8 -source activate py3-igibson -pip install -e . # This step takes about 4 minutes +git clone https://github.com/StanfordVL/bddl +cd bddl +python setup.py install ``` -We recommend the third method if you plan to modify iGibson in your project. If you plan to use it as it is to train navigation and manipulation agents, the pip installation or docker image should meet your requirements. 
- -Note: If you are not using conda, you will need the system packages python3-dev (header files to build Python extensions) and python3-opencv (provides opencv and its dependencies). - -## The SVL pybullet fork -Note: we support using a custom pybullet version to speed up the physics in iGibson. This is installed automatically if you install iGibson. If you already have pybullet installed in your conda environment, you can replace it with our fork as follows: - -```bash -pip uninstall pybullet -pip install pybullet-svl -``` - -## Downloading the Assets - -First, configure where iGibson's assets (robotic agents, objects, 3D environments, etc.) is going to be stored. It is configured in `your_installation_path/igibson/global_config.yaml` - -To make things easier, the default place to store the data is: -```bash -assets_path: your_installation_path/igibson/data/assets -g_dataset_path: your_installation_path/igibson/data/g_dataset -ig_dataset_path: your_installation_path/igibson/data/ig_dataset -threedfront_dataset_path: your_installation_path/igibson/data/threedfront_dataset -cubicasa_dataset_path: your_installation_path/igibson/data/assetscubicasa_dataset -``` - -If you are happy with the default path, you don't have to do anything, otherwise you can run this script: -```bash -python -m igibson.utils.assets_utils --change_data_path -``` - -Second, you can download our robot models and objects from [here](https://storage.googleapis.com/gibson_scenes/assets_igibson.tar.gz) and unpack it in the assets folder, or simply run this download script: - -```bash -python -m igibson.utils.assets_utils --download_assets -``` - - -Third, you need to download some large 3D reconstructed real-world environments (e.g. houses and offices) from [our dataset](dataset.md) for your agents to be trained in. Create a new folder for those environments and set the path in `your_installation_path/igibson/global_config.yaml` (default and recommended: `your_installation_path/igibson/data/g_dataset` and `your_installation_path/igibson/data/ig_dataset`). You can get access and download the Gibson dataset (after filling up the following [license agreement](https://forms.gle/36TW9uVpjrE1Mkf9A)) and the iGibson dataset (following the guide [here](http://svl.stanford.edu/igibson/docs/dataset.html#download-instruction) or following the instructions below). In addition, you can download a single [high quality small environment R's](https://storage.googleapis.com/gibson_scenes/Rs.tar.gz) for demo purposes. - -To download the demo data, run: - -```bash -python -m igibson.utils.assets_utils --download_demo_data -``` - -The full Gibson and iGibson dataset can be downloaded using the following command, this script automatically downloads, decompresses, and puts the dataset to correct place. You will get `URL` after filling in the agreement form. - -Download iGibson dataset -```bash -python -m igibson.utils.assets_utils --download_ig_dataset -``` - -Download Gibson dataset ([agreement signing](https://forms.gle/36TW9uVpjrE1Mkf9A) required to get `URL`) -```bash -python -m igibson.utils.assets_utils --download_dataset URL -``` - -## Testing - -To test igibson is properly installed, you can run -```bash -python ->> import igibson -``` +## Uninstalling -For a full suite of tests and benchmarks, you can refer to [tests](tests.md) for more details. 
+Uninstalling BDDL is just as simple: `pip uninstall bddl` -## Uninstalling -Uninstalling iGibson is easy: `pip uninstall igibson` diff --git a/docs/intro.md b/docs/intro.md index 85cc9df7..aa1e0917 100644 --- a/docs/intro.md +++ b/docs/intro.md @@ -1,39 +1,25 @@ -# iGibson: the Interactive Gibson Environment -### Large Scale Interactive Simulation Environments for Robot Learning +# BDDL: BEHAVIOR Domain Definition Language -iGibson, the Interactive Gibson Environment, is a simulation environment providing fast visual rendering and physics simulation (based on Bullet). It is packed with a dataset with hundreds of large 3D environments reconstructed from real homes and offices, and interactive objects that can be pushed and actuated. iGibson allows researchers to train and evaluate robotic agents that use RGB images and/or other visual sensors to solve indoor (interactive) navigation and manipulation tasks such as opening doors, picking and placing objects, or searching in cabinets. +The BEHAVIOR Domain Definition Language (BDDL) is the domain-specific language of the BEHAVIOR benchmark for embodied AI agents in simulation. BEHAVIOR's 100 activities are realistic, diverse, and complex, and BDDL facilitates data-driven definition of these activities. BDDL is object-centric and based in predicate logic, and can express an activity's initial and goal conditions symbolically. The codebase includes 100 such symbolic definitions and functionality for parsing them and any other BDDL file, including a custom one; compiliing the symbolic definition to be grounded in a physically simulated environment; checking success and progress efficiently at every simulator step; and solving the goal condition to measure finer-grained progress for evaluation. ### Citation -If you use iGibson or its assets and models, consider citing the following publication: +If you use BEHAVIOR, consider citing the following publication: ``` -@misc{shen2021igibson, - title={iGibson 1.0: a Simulation Environment for Interactive Tasks in Large Realistic Scenes}, - author={Bokui Shen and Fei Xia and Chengshu Li and Roberto Martín-Martín and Linxi Fan and Guanzhi Wang and Claudia Pérez-D'Arpino and Shyamal Buch and Sanjana Srivastava and Lyne P. Tchapmi and Micael E. Tchapmi and Kent Vainio and Josiah Wong and Li Fei-Fei and Silvio Savarese}, +@inproceedings{srivastava2021behavior, + title={BEHAVIOR: Benchmark for Everyday Household Activities in Virtual, Interactive, Ecological Environments}, + author={Sanjana Srivastava* and Chengshu Li* and Michael Lingelbach* and Roberto Martín-Martín* and Fei Xia and Kent Vainio and Zheng Lian and Cem Gokmen and Shyamal Buch and C. Karen Liu and Silvio Savarese and Hyowon Gweon and Jiajun Wu and Li Fei-Fei}, year={2021}, - eprint={2012.02924}, - archivePrefix={arXiv}, - primaryClass={cs.AI} -} -``` - -``` -@misc{li2021igibson, - title={iGibson 2.0: Object-Centric Simulation for Robot Learning of Everyday Household Tasks}, - author={Chengshu Li and Fei Xia and Roberto Martín-Martín and Michael Lingelbach and Sanjana Srivastava and Bokui Shen and Kent Vainio and Cem Gokmen and Gokul Dharan and Tanish Jain and Andrey Kurenkov and Karen Liu and Hyowon Gweon and Jiajun Wu and Li Fei-Fei and Silvio Savarese}, - year={2021}, - eprint={2108.03272}, - archivePrefix={arXiv}, - primaryClass={cs.RO} + booktitle={5th Annual Conference on Robot Learning} } ``` ### Code Release -The GitHub repository of iGibson can be found here: [iGibson GitHub Repo](https://github.com/StanfordVL/iGibson). 
Bug reports, suggestions for improvement, as well as community developments are encouraged and appreciated. +The GitHub repository of BDDL can be found here: [BDDL GitHubRepo](https://github.com/StanfordVL/bddl). Bug reports, suggestions for improvement, as well as community developments are encouraged and appreciated. ### Documentation -The documentation for iGibson can be found here: [iGibson Documentation](http://svl.stanford.edu/igibson/docs/). It includes installation guide (including data download instructions), quickstart guide, code examples, and APIs. +The documentation for iGibson can be found here: [BDDL Documentation](https://stanfordvl.github.io/bddl). It includes installation guide, quickstart guide, code examples, and APIs. -If you want to know more about iGibson, you can also check out [our webpage](http://svl.stanford.edu/igibson), [iGibson 2.0 arxiv preprint](https://arxiv.org/abs/2108.03272) and [iGibson 1.0 arxiv preprint](https://arxiv.org/abs/2012.02924). +If you want to know more about BDDL, you can also check out the [BEHAVIOR paper](https://arxiv.org/abs/2108.03332) and [our webpage](https://behavior.stanford.edu/), where you can create your own BEHAVIOR activity using a visual version of BDDL! diff --git a/docs/overview.md b/docs/overview.md index 96a5ea4c..9105025b 100644 --- a/docs/overview.md +++ b/docs/overview.md @@ -1,16 +1,71 @@ # Overview -Next, we will give an overview of iGibson and briefly explain the different modules in our system. -![quickstart.png](images/overview.png) +The BDDL codebase has two primary sections: activity definition and interface with the simulator. Given a simulator and an agent deployed in it, BDDL is typically used to instantiate a BEHAVIOR activity definition in the simulator, then check the agent's progress/success at every step. -First of all, we have **Dataset** and **Assets**. **Dataset** contain 3D reconstructed real-world environments. **Assets** contain models of robots and objects. Download guide can be found [here](installation.html#downloading-the-assets). More info can be found here: [Dataset](dataset.md) and [Assets](assets.md). +## Activity definition -Next, we have **Renderer** and **PhysicsEngine**. These are the two pillars that ensure the visual and physics fidelity of iGibson. We developed our own MeshRenderer that supports customizable camera configuration, physics-based rendering (PBR) and various image modalities, and renders at a lightening speed. We use the open-sourced [PyBullet](http://www.pybullet.org/) as our underlying physics engine. It can simulate rigid body collision and joint actuation for robots and articulated objects in an accurate and efficient manner. Since we are using MeshRenderer for rendering and PyBullet for physics simulation, we need to keep them synchronized at all time. Our code have already handled this for you. More info can be found here: [Renderer](renderer.md) and [PhysicsEngine](physics_engine.md). +### Summary -Furthermore, we have **Scene**, **Object**, **Robot**, and **Simulator**. **Scene** loads 3D scene meshes from `igibson.g_dataset_path, igibson.ig_dataset_path`. **Object** loads interactable objects from `igibson.assets_path`. **Robot** loads robots from `igibson.assets_path`. **Simulator** maintains an instance of **Renderer** and **PhysicsEngine** and provides APIs to import **Scene**, **Object** and **Robot** into both of them and keep them synchronized at all time. 
More info can be found here: [Scene](./scenes.md), [Object](./objects.md), [Robot](./robots.md), and [Simulator](simulators.md). +BDDL activities are defined by a set of **objects** in a scene, a ground **initial condition** that the scene configuration satisfies when the agent starts the activity, and a **goal condition** logical expression that the scene configuration must satisfy for the agent to reach success. The following example demonstrates this: -Moreover, we have **Task**, **Sensor** and **Environment**. **Task** defines the task setup and includes a list of **Reward Function** and **Termination Condition**. It also provides task-specific reset functions and task-relevant observation definition. **Sensor** provides a light wrapper around **Render** to retrieve sensory observation. **Environment** follows the [OpenAI gym](https://github.com/openai/gym) convention and provides an API interface for external applications. More info can be found here: [Environment](environments.md). +``` +(define + (problem cleaning_the_pool_simplified) + (:domain igibson) -Finally, any learning framework (e.g. RL, IL) or planning and control framework (e.g. ROS) can be used with **Environment** as long as they accommodate OpenAI gym interface. We provide tight integration with **ROS** that allows for evaluation and visualization of, say, ROS Navigation Stack, in iGibson. More info can be found here: [Learning Framework](learning_framework.md) and [ROS](ros_integration.md). + (:objects + pool.n.01_1 - pool.n.01 + floor.n.01_1 - floor.n.01 + scrub_brush.n.01_1 - scrub_brush.n.01 + sink.n.01_1 - sink.n.01 + agent.n.01_1 - agent.n.01 + ) + + (:init + (onfloor pool.n.01_1 floor.n.01_1) + (stained pool.n.01_1) + (onfloor scrub_brush.n.01_1 floor.n.01_1) + (inroom floor.n.01_1 garage) + (inroom sink.n.01_1 storage_room) + (onfloor agent.n.01_1 floor.n.01_1) + ) + + (:goal + (and + (onfloor ?pool.n.01_1 ?floor.n.01_1) + (not + (stained ?pool.n.01_1) + ) + (ontop ?scrub_brush.n.01_1 ?shelf.n.01_1) + ) + ) +) +``` +The `:objects` and `:init` sections specify the initial state as a set of objects and a set of initial atomic formulae the objects must satisfy at the start of the task. The `:goal` section specifies the expression that the objects must satisfy for the task to be considered successfully completed, i.e. what the agent needs to achieve. + +### BDDL language + +BDDL includes two types of files: the **domain** file and the **problem** file. There is one domain file per simulator, and one problem file per activity definition (in this sense, "problem" and "activity definition" are interchangeable). + +The domain contains three sections: domain name (`(domain )`); requirements (`(:requirements :strips :adl)` as BDDL relies on these); and predicates, a list of predicates with fields indicating their arity. See the example created for iGibson 2.0 [here](https://github.com/StanfordVL/bddl/blob/master/bddl/activity_definitions/domain_igibson.bddl). + +The problem, i.e. an activity definition, is more complex. It consists of a problem name, a domain, objects, initial condition, and goal condition. See examples in subdirectories [here](https://github.com/StanfordVL/bddl/tree/master/bddl/activity_definitions). + +By convention, the problem (`problem`) section should take the form of `(problem _)`, where `activity_instance` is some identifying number that distinguishes this definition of `behavior_activity` from other definitions of the same activity, since a BEHAVIOR activity (e.g. 
"packing lunches" or "cleaning bathtub") can be defined multiple times to make multiple versions. + +The domain (`:domain`) section should take the form of `(:domain )`, where `domain_name` matches to the domain `define`d in some domain file. + +The objects (`:objects`) section should contain all object instances involved in the activity definition, categorized. For example, for an activity definition with three instances of some category `mycat`, `:objects` should include the following line: `mycat_1 mycat_2 mycat_3 - mycat`. BDDL requires that object instances be written as `_` where `category` is a WordNet synset (see the next section for details on the role of WordNet in BDDL). `:objects` should list and categorize every object instance used in the definition. + +The initial condition (`:init`) should consist of a list of ground atomic formulae. This means that `:init` cannot contain logical operators such as `and`, `forall`, or `forpairs`, and all objects involved in it must be instances - concrete object instances like `mycat_1` - and not variables that may indicate multiple possible instances (e.g. just `mycat`, which could be any instance with category `mycat`). `:init` can contain certain types of negated atomic formulae (using the `not` logical operator) - specifically, when the atomic formula is **not** involved in location of objects. So, a BDDL activity definition **can** have `(not (cooked mycat_1))` but it **cannot** have `(not (ontop mycat_1 othercat_1))`. This is for the sampler - it is difficult to sample "anywhere-but" efficiently and ecologically. + +Finally, the goal condition (`:goal`) should consist of one logical expression, likely a conjunction of clauses. This expression can use any of the standard logical operators used in the [Planning Domain Definition Language (PDDL)](https://planning.wiki/ref/pddl/problem), namely `and`, `or`, `not`, `imply`, `forall`, and `exists`. It can also use our custom operators: `forn`, `forpairs`, and `fornpairs`. Note that + +### Annotations for activity definition + + +### Creating your own activity definition + + +## Interface with a simulator -We highly recommend you go through each of the Modules below for more details and code examples. diff --git a/docs/quickstart.md b/docs/quickstart.md index e72c30fb..9638210a 100644 --- a/docs/quickstart.md +++ b/docs/quickstart.md @@ -1,109 +1,2 @@ # Quickstart -## iGibson in Action -Assume you finished installation and assets downloading. Let's get our hands dirty and see iGibson in action. - -```bash -python -m igibson.examples.demo.env_example -``` -You should see something like this: -![quickstart.png](images/quickstart.png) - -The main window shows PyBullet visualization. The robot (TurtleBot) is moving around with random actions in a realistic house (called "Rs", the one you just downloaded!). - -On the right hand side, you can see two windows from our mesh renderer. The top one (RobotView) shows the robot's first person view. The bottom one (ExternalView) shows the view of a virtual camera floating in the air. - -If you want to have a virtual tour around the house yourself, you can click on the ExternalView window, and then translate the virtual camera to a different location by pressing "WASD" on your keyboard and rotate it to a different angle by dragging your mouse. - -That's it! - -## Using Docker and remote GUI access via VNC - -If you go the docker route, please first pull our pre-built images (see the installation guide). 
After downloading, run `docker images`, and you should see `igibson/igibson:latest` and `igibson/igibson-gui:latest`. - -On a headless server (such as a Google Cloud or AWS instance), run -``` -cd iGibson -./docker/headless-gui/run.sh -# run a GUI example after the container command line prompt shows: -python simulator_example.py -``` - -On your local machine, you can use any VNC client to visit the remote GUI at `:5900` with the default password `112358`. - -For example, Mac OS X provides a native app called [Screen Sharing](https://support.apple.com/guide/mac-help/share-the-screen-of-another-mac-mh14066/mac) that implements the VNC protocol. - -To change the default port and password (must be 6 digits): - -``` -./docker/headless-gui/run.sh --vnc-port 5903 --vnc-password 654321 -``` - -If you do not need GUI, -``` -./docker/base/run.sh -# run a script after the container command line prompt shows: -python benchmark.py -``` - -## Benchmarks - - -Performance is a big designing focus for iGibson. We provide a few scripts to benchmark the rendering and physics -simulation framerate in iGibson. - -### Benchmark static scene (Gibson scenes) -```bash -python -m igibson.test.benchmark.benchmark_static_scene -``` - -You will see output similar to: -``` -physics simulation + rendering rgb, resolution 512, render_to_tensor True: 421.12805140080695 fps -Rendering rgb, resolution 512, render_to_tensor True: 778.2959856272473 fps -Rendering 3d, resolution 512, render_to_tensor True: 857.2466839793148 fps -Rendering normal, resolution 512, render_to_tensor True: 878.6977946996199 fps - -physics simulation + rendering rgb, resolution 512, render_to_tensor False: 205.68141718250024 fps -Rendering rgb, resolution 512, render_to_tensor False: 265.74379871537326 fps -Rendering 3d, resolution 512, render_to_tensor False: 292.0761459884919 fps -Rendering normal, resolution 512, render_to_tensor False: 265.70666134193806 fps - -``` - -### Benchmark physics simulation in interactive scenes (iGibson scene) - -```bash -python -m igibson.test.benchmark.benchmark_interactive_scene -``` - -It will generate a report like below: - -![](images/scene_benchmark_Rs_int_o_True_r_True.png) - - -### Benchmark rendering in interactive scenes - -To run a comprehensive benchmark for all rendering in all iGibson scenes, you can excute the following command: - -```bash -python -m igibson.test.benchmark.benchmark_interactive_scene_rendering -``` - -It benchmarks two use cases, one for training visual RL agents (low resolution, shadow mapping off), another one for - training perception tasks, with highest quality of graphics possible. - - ```python - 'VISUAL_RL': MeshRendererSettings(enable_pbr=True, enable_shadow=False, msaa=False, optimized=True), - 'PERCEPTION': MeshRendererSettings(env_texture_filename=hdr_texture, - env_texture_filename2=hdr_texture2, - env_texture_filename3=background_texture, - light_modulation_map_filename=light_modulation_map_filename, - enable_shadow=True, msaa=True, - light_dimming_factor=1.0, - optimized=True) - -``` -It will generate a report like below: -![](images/benchmark_rendering.png) -