diff --git a/build-with/index.html b/build-with/index.html new file mode 100644 index 0000000..95aaef9 --- /dev/null +++ b/build-with/index.html @@ -0,0 +1,304 @@ + + +
+ + + ++If you haven't already, be sure to sign up for the Developer Newsletter, to learn when there are important changes that may affect you. +
++Thank you for participating! +
++This information is necessarily incomplete: there are always new devices being supported by OSVR, and not all development happens in a centralized location or is brought to the attention of the broader OSVR community. +If you notice an omission, you can submit a pull request for +the file/repo +that fetches the data to generate this page. +
++ +If you notice an error, let us know via the support portal, or by clicking the "Help" button in the lower-right corner of the page. + +
++OSVR has been officially ported to run on the following operating systems: +
+OSVR has plugins that bring its wealth of devices and peripherals into many engines: +
++This table only lists those displays whose display descriptor JSON ships with OSVR-Core or is otherwise known to the wider OSVR community. +Creating such a descriptor for an arbitrary display is straightforward, meaning this list is nowhere near all-encompassing. +
++Note that due to a temporary technical limitation, displays with multiple modes of operation are listed multiple times in the table, once for each mode. +
+Vendor | +Model | +Version | +Notes | +
Vuzix | +Wrap 1200dx | +1.0 | ++ +(Data source: +display descriptor +) + | +
Vuzix | +IWear 720 | +1.0 | ++ +(Data source: +display descriptor +) + | +
OSVR | +HDK | +1.2 | ++Suitable for HDK 1.0-1.2 +(Data source: +display descriptor +) + | +
OSVR | +HDK | +1.3 | ++Specific to the optics of 1.3, with Render Manager compatible distortion parameters +(Data source: +display descriptor +) + | +
OSVR | +HDK | +1.3 | ++Specific to the optics of 1.3, with a specialized distortion correction requiring apps to use RenderManager 0.6.40 or newer. +(Data source: +display descriptor +) + | +
OSVR | +HDK | +2.0 | ++OSVR HDK 2.0 +(Data source: +display descriptor +) + | +
Oculus | +Rift | +DK1 | ++ +(Data source: +display descriptor +) + | +
Oculus | +Rift | +DK2 | ++ +(Data source: +display descriptor +) + | +
Sensics | +dSight | ++ | +1920x1080 landscape-mode video, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +dSight | ++ | +1920x1080 landscape-mode video, 2 video inputs +(Data source: +display descriptor +) + | +
Sensics | +dSight | ++ | +1080x1920 portrait-mode video, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +dSight | ++ | +1080x1920 portrait-mode video, 2 video inputs +(Data source: +display descriptor +) + | +
Sensics | +xSight 6123 | ++ | +Assumes xSight SVP is set for 1920x1080 full screen +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x1024 resolution mode, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x1024 resolution mode, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x1024 resolution mode, 2 video inputs +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x720 resolution mode, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x720 resolution mode, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight | +1.1 | ++1280x720 resolution mode, 2 video inputs +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920-100 | ++ | +1920x1080, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920-100 | ++ | +1920x1080, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920-100 | ++ | +1920x1080, 2 video inputs +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920 | ++ | +1920x1080, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920 | ++ | +1920x1080, 1 video input +(Data source: +display descriptor +) + | +
Sensics | +zSight 1920 | ++ | +1920x1080, 2 video inputs +(Data source: +display descriptor +) + | +
Vrvana | +Totem | +1 | ++ +(Data source: +display descriptor +) + | +
FOVE | +FOVE 0 | ++ | + +(Data source: +display descriptor +) + | +
HTC | +Vive/Vive PRE | ++ | +This is a sample descriptor file only - please follow the instructions to use the ViveDisplayExtractor to generate a correct one for your device! +(Data source: +display descriptor +) + | +
LaputaVR | +Hero | +1.0 | ++Specific to the optics of LaputaVR HMD, with a specialized distortion correction mesh requiring apps to use RenderManager 0.6.40 or newer. +(Data source: +display descriptor +) + | +
Device Vendor | +Device Name | +Interface Classes Implemented | +Other | +
OSVR | +Hacker Development Kit (HDK) Integrated IMU Tracker | +
+
|
+
+
|
+
Razer | +Hydra motion controller | +
+
|
+
+
|
+
YEI | +3Space Sensor | +
+
|
+
+
|
+
Generic | +OpenCV Video Capture | +
+
|
+
+
|
+
3Dconnexion | +SpacePilot Pro 3D Mouse | +
+
|
+
+
|
+
3Dconnexion | +Space Explorer 3D Mouse | +
+
|
+
+
|
+
3Dconnexion | +Space Navigator 3D Mouse | +
+
|
+
+
|
+
3Dconnexion | +Space Navigator 3D Mouse for Notebooks | +
+
|
+
+
|
+
3Dconnexion | +Space Traveller 3D Mouse | +
+
|
+
+
|
+
3Dconnexion | +Spaceball 5000 | +
+
|
+
+
|
+
3Dconnexion | +Spacemouse Plus XT | +
+
|
+
+
|
+
3Dconnexion | +Spacemouse Pro | +
+
|
+
+
|
+
3Dconnexion | +Spacemouse Wireless | +
+
|
+
+
|
+
Fifth Dimension Technologies | +5DT Data Glove 14 Ultra | +
+
|
+
+
|
+
Fifth Dimension Technologies | +5DT Data Glove 16 | +
+
|
+
+
|
+
Fifth Dimension Technologies | +5DT Data Glove 5 Ultra | +
+
|
+
+
|
+
Advanced Realtime Tracking GmbH | +Flystick | +
+
|
+
+
|
+
Performance Designed Products | +Afterglow Ax1 Controller for XBox 360 | +
+
|
+
+
|
+
Arrington Research | +ViewPoint EyeTracker | +
+
|
+
+
|
+
Ascension Technology Corporation | +Flock of Birds | +
+
|
+
+
|
+
Atmel Corporation | +ATmega32 | +
+
|
+
+
|
+
BG Systems, Inc. | +CerealBox | +
+
|
+
+
|
+
Bauhaus University Weimar | +Inertia Mouse | +
+
|
+
+
|
+
CH Products | +Fighterstick USB | +
+
|
+
+
|
+
Contour Design, Inc. | +ShuttleXpress | +
+
|
+
+
|
+
Crossbow Technology, Inc. | +RGA300CA | +
+
|
+
+
|
+
Dream Cheeky | +Roll-up Drum Kit USB | +
+
|
+
+
|
+
Futaba | +InterLink Elite Controller | +
+
|
+
+
|
+
Generic | +GPS (serial NMEA protocol) tracker | +
+
|
+
+
|
+
Global Haptics | +Orb | +
+
|
+
+
|
+
Griffin Technology | +PowerMate Controller | +
+
|
+
+
|
+
Hayden-Kerk | +IDEA PCM4806X Motor Controller | +
+
|
+
+
|
+
Hillcrest Labs | +Freespace | +
+
|
+
+
|
+
In2Games | +Gametrak | +
+
|
+
+
|
+
InterSense | +IS-900 MicroTrax Hand tracker | +
+
|
+
+
|
+
InterSense | +IS-900 MicroTrax Head tracker | +
+
|
+
+
|
+
InterSense | +IS-900 MicroTrax Wand | +
+
|
+
+
|
+
LogiCad 3D | +Magellan controller | +
+
|
+
+
|
+
Logitech | +Extreme 3D Pro Joystick | +
+
|
+
+
|
+
Solution Technologies, Inc. | +MicroScribe 3D | +
+
|
+
+
|
+
Mindtel LLC | +Totally Neat Gadget (TNG) 3 | +
+
|
+
+
|
+
Motion Workshop | +MotionNode | +
+
|
+
+
|
+
Northern Digital Inc | +Polaris Spectra | +
+
|
+
+
|
+
Northern Digital Inc | +Polaris Vicra | +
+
|
+
+
|
+
National Instruments | +DAQCard DIO 24 | +
+
|
+
+
|
+
Nintendo | +WiiMote | +
+
|
+
+
|
+
Novint Technologies Inc | +Falcon | +
+
|
+
+
|
+
Oculus VR | +Rift DK1 | +
+
|
+
+
|
+
Oculus VR | +Rift DK2 | +
+
|
+
+
|
+
Origin Instruments | +DynaSight | +
+
|
+
+
|
+
PNI Sensor Corporation | +SpacePoint Fusion | +
+
|
+
+
|
+
PhaseSpace | +Motion Capture | +
+
|
+
+
|
+
Polhemus | +3Space | +
+
|
+
+
|
+
Polhemus | +G4 | +
+
|
+
+
|
+
Polhemus | +Isotrak tracker | +
+
|
+
+
|
+
Polhemus | +LIBERTY | +
+
|
+
+
|
+
Polhemus | +LIBERTY LATUS | +
+
|
+
+
|
+
Polhemus | +PowerTRAK 360 | +
+
|
+
+
|
+
Retrolink USB | +Classic GameCube-style Controller | +
+
|
+
+
|
+
Retrolink USB | +Classic Genesis Controller | +
+
|
+
+
|
+
Saitek | +ST290 Pro Joystick | +
+
|
+
+
|
+
Microsoft | +SideWinder Precision (raw driver) | +
+
|
+
+
|
+
Microsoft | +SideWinder Precision 2 (raw driver) | +
+
|
+
+
|
+
TRIVISIO Prototyping GmbH | +Colibri | +
+
|
+
+
|
+
Thalmic Labs | +Myo | +
+
|
+
+
|
+
US Digital | +A2 Absolute Rotary Encoder | +
+
|
+
+
|
+
VR-Space | +WinTrackerIII | +
+
|
+
+
|
+
Vinten Radamec | +Serial Port Interface (SPI) | +
+
|
+
+
|
+
P.I. Engineering | +X-keys Desktop USB | +
+
|
+
+
|
+
P.I. Engineering | +X-keys Jog and Shuttle | +
+
|
+
+
|
+
P.I. Engineering | +X-keys Joystick | +
+
|
+
+
|
+
P.I. Engineering | +X-keys Professional USB | +
+
|
+
+
|
+
P.I. Engineering | +X-keys XK-12 Jog and Shuttle | +
+
|
+
+
|
+
P.I. Engineering | +X-keys XK-12 Joystick | +
+
|
+
+
|
+
P.I. Engineering | +X-keys XK-3 | +
+
|
+
+
|
+
P.I. Engineering | +X-keys XK-68 Jog and Shuttle | +
+
|
+
+
|
+
Microsoft | +X-Box Controller 360 (and compatible - raw driver) | +
+
|
+
+
|
+
Microsoft | +X-Box Controller S (and compatible - raw driver) | +
+
|
+
+
|
+
Yost Labs (YEI) | +3-Space Sensor | +
+
|
+
+
|
+
Zaber | +T-LA linear actuator | +
+
|
+
+
|
+
Leap Motion | +Leap Motion Controller | +
+
|
+
+
|
+
Hillcrest Labs | +FSM-9 Tracker | +
+
|
+
+
|
+
Nod | +Backspin | +
+
|
+
+
|
+
HTC | +Vive PRE and Vive Controllers | +
+
|
+
+
|
+
SMI | +Tracker | +
+
|
+
+
|
+
FOVE | +Fove Eye Tracker Plugin | +
+
|
+
+
|
+
FOVE | +Fove Head Tracker Plugin | +
+
|
+
+
|
+
Oculus VR | +Oculus Rift Tracker/Sensors (via Oculus SDK) | +
+
|
+
+
|
+
OSVR | +StandardFirmata | +
+
|
+
+
|
+
Oculus VR | +Oculus Rift Trackers/Sensors (via OpenHMD) | +
+
|
+
+
|
+
Microsoft | +Kinect for Windows | +
+
|
+
+
|
+
Microsoft | +Kinect for Xbox ONE | +
+
|
+
+
|
+
Nintendo |
+Wiimote |
+
+
|
+
+
|
+
Ximmerse | +Ximmerse Outside/In Controllers | +
+
|
+
+
|
+
OSVR | +Comprehensive Plugin Example | +
+
|
+
+
|
+
OSVR | +Comprehensive Plugin Example | +
+
|
+
+
|
+
LaputaVR | +Hero | +
+
|
+
+
|
+
+The + +OSVR waffle.io board + +contains an overview of issues currently in GitHub issue trackers for all OSVR framework projects. +It summarizes the issues in a number of lists: +
++Of course, the issue lists are not all-encompassing: if you've got a contribution you'd like to make, we'd love to see it! +Filing an issue on the right project would be a great first step. +
+
+If you can't find the project you're looking for below, you can access a
+full list of projects in the OSVR organization
+(note that this link includes a filter to exclude projects forked by OSVR).
+
The core libraries, applications, and plugins of the OSVR software platform. Includes osvr_reset_yaw, which allows setting the '0 heading'; osvr_print_tree, which exports the path tree; and osvr_json_to_c, which converts JSON files into C headers.
+ +Catch-all project for issues and information not specific to a single repo.
+ +Provides distortion correction, time warp, direct mode (on Windows), overfill and oversampling services.
+ +HTC Vive (HMD and controllers) plugin for OSVR.
+ +Oculus Rift tracking plugin for OSVR.
+ +OSVR plugin for trackers of the ANTVR HMDs.
+ +LaputaVR Hero plugin for OSVR.
+ +OSVR plugin for Sensics emulated binoculars and other optical devices.
+ +OSVR plugin for InertialLabs tracker.
+ +OSVR plugin for Ximmerse outside/in controllers.
Alternative OSVR plugin for PS Move.
+ +OSVR plugin for PS Move.
+ +OSVR plugin providing Oculus Rift DK1 & DK2 orientation tracking via OpenHMD.
+ +An OSVR plugin providing joint position and orientation data from the Kinect.
+ +An OSVR plugin providing analog and digital data from connected Arduinos running the StandardFirmata firmware.
+ +An OSVR plugin providing up to four Wii Remote + Nunchuk devices via the wiiuse library.
+ +OSVR plugin for LYRobotix Nolo VR tracker system.
+ +An OSVR plugin that creates trackers by combining different sources of data.
+ +Package for authoring OSVR experiences with Unity.
+ +Integration of OSVR with the Unreal Engine.
+ +An OSVR plugin for SteamVR, providing SteamVR support for OSVR HMDs.
+ +Managed code (.NET) wrapper for OSVR ClientKit.
+ +Drivers and related code/data for improving the hardware experience for the HDK on Windows.
+ +Automated build system and submodules to compile the OSVR framework for Android.
+ +Additional files used to generate an NDK-BUILD compatible OSVR Android build.
+ +A set of samples demonstrating basic OSVR usage on Android.
+ +An Android app that launches the OSVR server.
+ +A collection of CMake scripts that may be useful to the Android NDK community.
Configuration utility for OSVR.
+ +Utility for viewing OSVR tracking data.
+ +Tray utility for configuring OSVR.
+ +Tool for determining distortion parameters of arbitrary HMDs, and a corresponding set of shaders to correct that distortion.
+ +Windows installer for OSVR server application built with NSIS.
+ +A program that displays all devices connected to your OSVR Server, and the data that they are providing.
+ +Browser-based tool for editing OSVR JSON files, based on JSON Schema.
+ +Interface specifications and device integration proposals.
+ +Boxstarter self-installer scripts to prepare user or developer environments using Chocolatey.
+ +A minimal library for dynamically-loaded or statically-linked functionality modules.
+ +OSVR is available for a number of operating systems:
Note: full instructions for building OSVR from scratch on Ubuntu 14.04 can be found here.
Prerequisites. Many of the prerequisites may be available in your Linux distribution's repositories. If you prefer to install them manually, the list of prerequisites follows:
+ +Acquire the source code. Check out the source code from the OSVR-Core repository.
Create a build directory. To keep the source repository clean of temporary and generated build files, create a separate directory to contain the build. We usually create a directory named build
inside the OSVR-Core directory.
$ mkdir build
+$ cd build
+
Generate a Makefile. Run CMake to generate a Makefile. Set the CMAKE_INSTALL_PREFIX
variable to the location where you would like the OSVR files to be installed:
$ cmake .. -DCMAKE_INSTALL_PREFIX=~/osvr
+
Compile OSVR. Navigate to the build directory you created in step 3 and run make
to build OSVR. Optionally run make install
to install the OSVR programs, libraries, and sample configuration files.
$ make
+$ make install
+
If you need further assistance with installing OSVR, email us at support@osvr.com
.
You may install OSVR for OS X using our Homebrew repository:
+ +$ brew tap OSVR/osvr
+$ brew install osvr-core --HEAD
+
+
+Prerequisites. The prerequisites may be installed using Homebrew. If you prefer to install them manually, the list of prerequisites follows:
+ +Acquire the source code. Check out the source code from the OSVR-Core repository.
Create a build directory. To keep the source repository clean of temporary and generated build files, create a separate directory to contain the build. We usually create a directory named build
inside the OSVR-Core directory.
$ mkdir build
+$ cd build
+
Generate a Makefile. Run CMake to generate a Makefile. Set the CMAKE_INSTALL_PREFIX
variable to the location where you would like the OSVR files to be installed:
$ cmake .. -DCMAKE_INSTALL_PREFIX=~/osvr
+
Compile OSVR. Navigate to the build directory you created in step 3 and run make
to build OSVR. Optionally run make install
to install the OSVR programs, libraries, and sample configuration files.
$ make
+$ make install
+
If you need further assistance with installing OSVR, email us at support@osvr.com
.
You may download precompiled binaries of OSVR for Windows from our snapshots page.
+ +Prerequisites. The prerequisites may be installed using the Boxstarter scripts found in the OSVR-Boxstarter Repository. If you prefer to install them manually, the list of prerequisites follows:
+ +Acquire the source code. Check out the source code from the OSVR-Core repository.
Create a build directory. To keep the source repository clean of temporary and generated build files, create a separate directory to contain the build. We usually create a directory named build
inside the OSVR-Core directory.
Generate a Microsoft Visual Studio project. Run CMake to generate a project file for your compiler. Set the CMAKE_INSTALL_PREFIX
variable to the location where you would like the OSVR files to be installed.
Compile OSVR. Navigate to the build directory you created in step 3 and load the project file. Build the ALL_BUILD
project and optionally "build" the INSTALL
project.
If you need further assistance with installing OSVR, email us at support@osvr.com
.
Running the OSVR server just requires passing it a configuration file:
+ +$ osvr_server osvr_server_config.json
+
+
+If you need further assistance with OSVR, email us at support@osvr.com
.
A number of utility programs have been developed for testing and debugging OSVR.
+ +The OSVR Tracker Viewer is a utility program which displays the positions and orientations of one or more tracker devices.
+ +While the OSVR server is running, you may run the osvr_print_tree
program to emit the path tree, showing the detected devices and associated aliases.
If you need further assistance with OSVR, email us at support@osvr.com
.
The OSVR Tracker Viewer is a utility program which displays the positions and orientations of one or more tracker devices.
Prebuilt binaries of OSVR Tracker Viewer are available for Windows.
+For other operating systems, check your platform's app store or software repository; otherwise, follow the instructions below to build from source.
+ +Install the prerequisites. Building the OSVR Tracker Viewer has the following prerequisites:
Acquire the source code. Check out the OSVR Tracker Viewer source code from the git repository.
Run CMake. Run CMake and set the CMAKE_PREFIX_PATH
to point to the installation directories of OSVR-Core and OpenSceneGraph.
Build OSVR Tracker Viewer. Run make
and make install
on Linux and OS X. On Windows, open the generated project file in Microsoft Visual Studio and build the project.
Running OSVR Tracker Viewer will display the following trackers (if available) by default:
+ +/me/hands/left
/me/hands/right
/me/head
If you wish to see other trackers, you may provide their paths on the command line:
+ +OSVRTrackerView --pose /me/head
+OSVRTrackerView --orientation /me/head
+
+
+Additional command-line options and usages are shown with the --help
option.
You can zoom by dragging with the right mouse button or scrolling the scroll wheel. Clicking and dragging with the left button rotates your view of the data. Note that the tool starts up showing you the standard OSVR coordinate system and orientation, so rotation is rarely useful. Pressing the spacebar will reset the view to its default orientation.
+ +The large set of axes is the world coordinates in OSVR—the viewer loads in the standard orientation. Each smaller set of axes is a tracker.
+ +As is convention, the x-axis is red, y-axis is green, and z-axis is blue (xyz-RGB).
+ +If you need further assistance, email us at support@osvr.com
.
ClientKit.dll
is in your plugins folder..unitypackage
file. When you want to test your game, ensure your devices are plugged in and start the OSVR server before running your game.If you need further assistance with integrating OSVR with your Unity game, email us at support@osvr.com
.
+
+OSVR is an open-source software platform for virtual and augmented reality. It allows discovery, configuration, and operation of hundreds of VR/AR devices and peripherals. OSVR supports multiple game engines and operating systems, and provides services such as asynchronous time warp and direct mode in support of low-latency rendering. OSVR software is provided free under the Apache 2.0 license and is maintained by Sensics.
+
+Please consider supporting the OSVR movement by +donating to OSVR +. +
+ + ++ Includes new presentation from February '16 +
These guidelines are based on the mozilla.org Forum Guidelines, used and modified under a CC BY-SA 3.0 Unported (or later) license. Note that civil community conduct is of the utmost importance, and these guidelines are not intended to be taken as the only bounds on acceptable conduct.
+ +There are a few ground rules for participation in community communication (mailing lists, issues, etc) in the OSVR project. Offenses may result in removal of the conduct from archives, if applicable, and can result in a temporary or permanent ban of the offending member from the community. Please respect these rules, and each other.
+ +No personal attacks. Do not feel compelled to defend your honor in public. Personal attacks and/or harassment of any kind are not tolerated, and have consequences as discussed above.
Community members are busy people, so please pay attention to the topic of your messages, and check that it still relates to the charter of the forum (issue, mailing list, etc) to which you are posting. Off-topic discussion that is not taken to private email (or another venue where it is on-topic) by someone who knows they should be taking it elsewhere may be removed from archives, and patterns of such behavior may result in more serious consequences, as discussed above.
Newcomers may be annoying. They ask the wrong questions, including ones that seem obvious (or whose answers seem easy to find). But lots of valued contributors start out this way, and treating newcomers kindly makes them more likely to turn into the valuable community members we all know and love, so cut them some slack when they mess up.
+ +So while you don’t have to humor them or suffer them gladly, and it’s fine to point out when they make mistakes, point newcomers in the right direction in addition to turning them away from the wrong ones, and be kind to them in the process of correcting their transgressions.
+ +It’s tempting to revisit controversial decisions you disagree with, but it’s rarely productive to do so, since it almost always results in the same heated, lengthy, and time/energy draining discussions leading to the same conclusion that was reached in the last round.
+ +Therefore, for issues already raised, discussed, and decided upon, reopen the discussion only if you have significant new information that would reasonably prompt reconsideration of the original decision.
+ +It is almost never appropriate to send the same message to two mailing lists, forums, or newsgroups. Please don’t do it.
+ +These community venues are for discussions about the OSVR software ecosystem and OSVR HDK design. As such, discussions about which operating system is better, or whether one toolkit is better than another, or whether (insert VR entity here) is the root of all evil, are not relevant. There are many forums for discussing such issues on the internet; please have such discussions there instead of in the OSVR community venues.
+ +Spam is a blight upon the face of the net. Nobody likes it. However, it is hard to avoid. Despite our best efforts, you will occasionally see spam on the OSVR mailing lists and newsgroups. If you feel the need to flame the spammer, do not CC the list. Complaining about spam in public increases noise, but not signal. It may make you feel better, but it doesn’t help. (For info on fighting spam effectively, check out spam.abuse.net.)
+ +Do not send binary attachments, including screen shots, and especially including screen shots of textual dialog boxes. Many people read these messages through slow network connections; try to be respectful of them. If you have a large file that you would like to distribute, put it on a Web page and announce the URL instead of attaching it.
+ +Do not quote the entire content of the message to which you are replying. Include only as much as is necessary for context. Remember that if someone wants to read the original message, they can; it is easily accessible. A good rule of thumb is, don’t include more quoted text than new text.
+ +There is always a need for some trimming - either a salutation, a signature, some blank lines or whatever. If you are doing no trimming whatsoever of the quoted text, then you aren’t trimming enough.
+ +Some people like to put reply after the quoted text, some like it the other way around, and still some prefer interspersed style. Debates about which posting style is better have led to many flame wars in the checkered history of the internet. To keep forum discussion friendly, please do interspersion with trimming (see above for trimming rules). For a simple reply, this is equivalent to bottom-posting. So, remove extraneous material, and place your comments in logical order, after the text you are commenting upon.
+ +Keep in mind that not everyone uses mail or news readers that can easily display HTML messages. Consequently, you will reach a larger audience if you post in plain-text. Many people simply ignore HTML messages, because it takes a nontrivial amount of effort to read them.
+ +If you encounter a bug, please take the time to file a report in the appropriate GitHub issues tracker about it. OSVR developers do not all have time to follow all the OSVR-related forums on a regular basis, and if you just post a bug report to a forum or mailing list then it may not reach anyone who can actually do anything about the bug. By reporting the bug through GitHub you ensure that it will receive a higher level of attention, and will be tracked along with other bugs.
Not everyone has time to read all forum postings. To ensure that your message reaches the right people in a timely manner, identify your subject matter clearly in the subject line. Subjects like "a question" and "OSVR problem" are not very helpful.
+ +Unfortunately, this bears repeating. Find out more about unsubscribing in the Mailing Lists section.
+ +Please do not send test messages to the mailing lists.
+ +devel
is not support.OSVR is a large effort with many potential audiences. There are specific channels for "support" requests - see support.osvr.com - that get different kinds of attention than the development discussion media. When deciding whether a message is a support request or suitable for a devel
post, err on the side of sending it to support: support will let you know if your question might be better suited to another venue.
These are low-volume announcement lists, recommended for those using OSVR.
+Provide developer chat rooms on a variety of topics such as OSVR-Core, OSVR-Unity, SteamVR-OSVR and more.
+See the Documentation and Community section
+ +We expect that community members participating in mailing lists, as well as other community venues (forums, issue trackers, etc.), will hold themselves to a high standard of conduct, so that our community actively encourages participation from a wide audience. Please see our etiquette guidelines for basic ground rules.
+The OSVR project currently has one main development discussion mailing list.
+ + +Originally presented by Yuval Boger (CEO, Sensics, Inc.) as an invited tech talk in the CONVRGE social VR platform, April 2015.
+ +OSVR - open source virtual reality - is a popular open-source project that includes both a wide-field open-source virtual reality goggle as well as a free and open-source software framework. This presentation, created by Sensics - founding contributor to OSVR - introduces the key components of OSVR.
+ + + + + +Originally presented by Ryan A. Pavlik (Senior Software Engineer, Sensics, Inc.) as an invited tech talk in the CONVRGE social VR platform, April 2015.
+ +++ + + + + +The OSVR (Open Source Virtual Reality) framework is a fully open-source framework facilitating the connection between virtual reality hardware, algorithms, and applications. I presented this talk in my role as senior software engineer at Sensics, the founding contributor of OSVR.
+ +Originally presented in CONVRGE on 19 April 2015, this talk gives a birds-eye view of the architecture of OSVR, then proceeds to a deep dive into the "path tree", semantic names, and aliases, from the perspective of the OSVR Core.
+
A talk presented by Yuval Boger (CEO, Sensics, Inc.) at the Boston VR Meetup in July 2015.
+ + + +July 2015
+ +Covers the "ground-up" details of writing an OSVR-Core client application or a new engine integration using the OSVR ClientKit API. (More low-level than required to use most engine integrations.)
+ +July 2015
+ +Discusses the current porting status of the OSVR software framework, as well as the steps taken to design for portability. Includes discussion of an early Android deployment plan, though this plan continues to evolve.
+ +July 2015
+ +An overview of the PluginKit API of OSVR-Core suitable for writing device or other plugins.
+ +July 2015
+ +Summarizes a number of "developmental stages" seen in the wild and in academic/industrial research with VR APIs (not naming names!) and their associated, varying levels of hardware abstraction. Discusses the concept of "factoring" devices into generic interfaces and guidelines used in the OSVR project.
+ +This can serve as an introduction to the OSVR Core and Path Tree presentation.
+ +August/September 2015
This slide deck contains information, a number of annotated screenshots, and links to walk you through getting started with Event Tracing for Windows and the OSVR custom providers/events.
+ +Viewing the slides in PDF format is recommended so you can zoom in and see the screenshots clearly.
Presentation given by Yuval Boger (CEO, Sensics, Inc.) at the Unity Vision 2016 summit. Discusses the OSVR software framework as a cross-platform middleware supporting display, input, and output devices in VR and AR applications.
+ + + +To add a presentation:
+ +Come up with a "stub" to identify the presentation - this takes the form YYYYMMDD-some-words-separated-by-hyphens
. It will be used all over in filenames, etc. - where you see stub
below, replace it with the contents of the stub. The date should be the date the presentation was given or created (some liberties may be taken there), and the words should be vaguely related to the title.
Create a folder named after the stub, and export the presentation (with speaker notes, if you have them) to a PDF named stub.pdf
in this directory.
Create a thumbnail: Typically, export (or screen-cap, if you must) the first slide of the presentation, and put it in /source/images/stub.png
. Don't worry about size - the build process will resize for use as a thumbnail. (I say "typically" because in the case of a presentation with video, you might want to capture a still from the video rather than the first slide.)
skip_image: true
to the entry in the presentation index page.Create the file /source/presentations/stub/index.html.md
as the "presentation post" - see below.
Update the file /source/presentations/index.html.erb
- the "presentation index page" - with a new entry, see below.
It's recommended for the person who created/gave the presentation to upload their slides to SlideShare.
+ +The file /source/presentations/stub/index.html.md
starts from this template:
---
+
+title: Enter the title here
+
+# slideshare_embed: HhzKtg9HoVwgmt
+# slideshare_url: http://www.slideshare.net/rpavlik/intro-to-etw-tracing-and-osvr-52175205
+
+# Uncomment the following line if your PDF has speaker notes in it.
+# has_speaker_notes: true
+
+layout: presentation
+
+---
+Date (month is enough) and description goes here.
+
+This is the "body" of the page, and can be any markdown or HTML you like. (For instance, you can paste a YouTube embed snippet here.
+
+
+Fill in the title in the front matter, leave the layout as presentation
(that's what will automatically create PDF links and SlideShare embeds, etc), and fill in the body of the page - the part that follows the second ---
. The body is formatted as Markdown; use that knowledge as you like.
If you've uploaded the presentation to SlideShare, you can uncomment the two slideshare lines in the front-matter and fill them in:
+ +slideshare_url
should just be the link right to the presentation on their site - remove any query-strings, etc. (Shouldn't be any ?
or #
in the URL, and often not any numbers unless you re-upload).slideshare_embed
is the embed key, a little harder to get: on the SlideShare page, click "Share", then copy and paste the embed snippet into a text editor. The very beginning of the snippet should begin something like <iframe src="//www.slideshare.net/slideshow/embed_code/key/HhzKtg9HoVwgmt"
... - you want to grab just the last component of that URL to put as the slideshare_embed
value (in this case, that's HhzKtg9HoVwgmt
)If your PDF added to this web site includes speaker notes, be sure to uncomment the appropriate line in the "front matter" - this will change the text of the links.
+ +If your presentation relates to other presentations, be sure to link (in both directions). A presentation post's "permalink" is http://osvr.github.io/presentations/stub/
, or relative to another presentation post, ../otherstub/
.
The file /source/presentations/index.html.erb
links to (essentially) all presentations, in reverse-chronological (newest first) order, grouped into sections by month.
A given month section starts with a line like this:
+ +<%= presentation_index_section 'September 2015' do |s|
+
+
+which opens a block (here, titled "September 2015") that contains one or more entries like this, one for each presentation:
+ + s.entry stub: '20150901-Intro-ETW-OSVR',
+ description: 'Slide deck with instructions and annotated screen shots showing how to analyze your VR system and software with ETW.'
+
+
+and finally closes with a simple
+ +end %>
+
+
+If you're adding a new section, you'll need to add all three parts (which just directly follow each other), while if you're adding a presentation to an existing section, you'll just add a s.entry
chunk.
The parameters for the entry
function used there are pretty simple, and almost all required. stub
is the stub, and description
is a brief description to show on the link button. The corresponding image will automatically be added to the entry (on non-small-screens), except if you add , skip_image: true
to the entry
call. (This should be considered a short-term measure: it looks much better with a thumbnail, but this will let you preview things without one first.) The title of the link will be pulled from the front-matter of the presentation post.
And yes, at least on Windows, the developer-mode thumbnails don't show up right. Rest assured, if the page actually renders (even though it'll have broken images), it'll deploy correctly to the live web site.
+ ++The lessons of several generations of "hardware-independent" VR APIs, the current state of the art, and the careful task of "factoring" a device into interfaces. +
++Principles of the OSVR ClientKit API for writing a new client application or engine integration. +
++Principles of the OSVR PluginKit API for writing device/other plugins. +
++A discussion of current, future, and potential for cross-platform OSVR portability. +
+
+A general introduction to the hardware, software, and community of Open Source Virtual Reality - OSVR.
+
++A technical presentation on the core of the OSVR software framework, particularly the design and function of the "Path Tree" concept. +
+Yuval Boger1 and Ryan A. Pavlik2, December 2014, revised March 2015
+ +OSVR™ is an open-source software platform for VR/AR applications.
+ +OSVR provides an easy and standardized way to discover, configure and operate hundreds of devices: VR goggles, position trackers, depth cameras, game controllers and more. OSVR supports multiple operating systems, plugs into leading game engines and is freely available under a permissive Apache 2.0 license.
+ +OSVR can be extended with open- or closed-source plugins. Plugins can provide support for new devices or add analysis capabilities such as a gesture engine, sensor fusion and data logging and eye tracking.
+ +OSVR was started by experts in gaming and virtual reality and is supported by an ever-growing list of hardware vendors, game studios, universities and software companies.
+ +This white paper provides a high-level overview of the motivation behind OSVR, its structure and key attributes.
+ +Twenty years ago, if you wanted to print a document from WordPerfect, you needed to install the corresponding WordPerfect driver for your printer. Then, operating systems (such as Windows) introduced a standardized printer abstraction layer so that any Windows application could print to any printer that had a Windows driver.
+ +OSVR standardizes the interface between input devices, games and output devices. It provides abstraction layers for VR devices and peripherals so that a game developer does not need to hardcode support for particular hardware. Instead, just like a Windows application prints to the Windows print services, the game developer connects to the OSVR abstraction layer. If a new VR goggle was introduced in 2016, an OSVR-based game published in 2015 could support this new goggle as soon as the goggle had an OSVR driver.
+ +To a game developer, OSVR provides interfaces – pipes of data – as opposed to an API tied to a specific piece of hardware. If there are multiple devices that provide the same type of information (for instance: hand position), these devices can be interchanged. Today, you might get hand position from a Microsoft Kinect. Tomorrow, you might get hand position from a Razer Hydra. You can reconfigure the OSVR "plumbing" so that the game can continue to work well regardless of where hand position is coming from. With OSVR, game developers can focus on what they want to do with the data, as opposed to how to obtain it.
+ +OSVR lets you mix and match hardware and software packages. For instance, if you use an eye tracking camera, you might use the software provided by the camera vendor to calculate gaze direction, but you might also use alternative gaze detection packages. This means that companies or research groups that focus on a particular software or hardware component (e.g. gaze detection module or eye tracking camera) are not left out of the VR eco-system: their expertise can be interconnected with components from others.
+ +For game and application developers OSVR reduces risk:
+ +For hardware manufacturers OSVR:
+ +For software developers OSVR provides the ability to:
+ +For end users OSVR provides freedom and choice, promotes innovation through its open design and eliminates the dependency on any single vendor.
+ +The diagram below shows the conceptual architecture for OSVR.
+ +Applications written on game engines can interact with OSVR via dedicated game engine plugins. Alternatively, an application can directly access the OSVR "ClientKit" interface.
+ +Analysis plugins are software modules that convert data from lower-level device plugins into higher-level information. For instance, a gesture engine plugin can convert a stream of XYZ coordinates to a recognized gesture.
+ +Device plugins connect to physical devices and expose interfaces – pipes of data – to higher layers. For instance, a plugin for a VR goggle can expose a display interface as well as an orientation tracker interface corresponding to an embedded head-tracker. Many device types and dozens of devices are supported.
+ +The adaptation layer provides OS-specific implementation (e.g. Android vs. Windows vs. iOS) as well as allows OSVR devices to be accessed through a network interface such as WiFi or Bluetooth.
+ +The management layer stores and loads system and user-specific configuration locally and in the cloud. It also provides services to download device plugins on-demand, detect when software updates are available and other utility functions.
+ +Source code and well-defined interfaces are provided for all these components. Device and analysis plugins can easily be added by interested parties, and examples of how to do so are provided as part of the OSVR source-code distribution.
+ +Prior to explaining how applications interface with OSVR and how to write plugins for OSVR, we need to touch on a few OSVR technical concepts:
+ +A device is a physical entity such as an orientation sensor or the Razer Hydra controller.
+ +An interface is a pipe of data. A device exposes one or more interfaces. For instance, a Razer Hydra controller exposes several interfaces:
+ +An interface is an instance of an interface class. An interface class defines properties that can be set or queried as well as a set of events that the class generates. A property might be the last obtained XYZ position from an XYZ position interface. An event could be the press of a particular button in a button set interface.
+ +A plugin is a software module that can be dynamically identified, loaded and connected to OSVR. Drivers contained in plugins implement interface classes. There are two types of drivers in plugins:
+ +From the OSVR perspective, both types of plugins are identical, and the distinction is made for human consumption.
+ +OSVR maintains a "path tree" – similar to a URL or file system path – in which all the sensing and rendering data is made available. Aliases are configured in the server to essentially redirect from a semantic path (a path with a meaningful name) all the way back to the system-specific hardware details. Thus, while direct device access by name is possible, it is not recommended: instead, we recommend accessing semantic paths. This accommodates use cases where the hardware is not immediately available at startup or is changed during operation without any impact to the application developer. Some examples:
+ +/me/hands/left
com.osvr.bundled.Multiserver
plugin: /com_osvr_bundled_Multiserver/RazerHydra0/position/0
/org_example_smoothing/smooth_filter/0
Just like a game allows mapping of various buttons to various game actions, OSVR allows defining the connection between interfaces, analysis plugins and actions. For instance:
+ +/joystick/button/1
→ /actions/fire
maps the first joystick button into a fire action. While the game could choose to access /joystick/button/1
directly, it is recommended to access /actions/fire
in this example because this allows changing the flow of information from the hardware through the OSVR layers without changing the game itself./com_osvr_bundled_Multiserver/RazerHydra0/position/0
→ /org_example_smoothing/smooth_filter/0
→ /me/hands/left
specifies that the position of the first Hydra controller goes through a smoothing filter and then is mapped to the left hand.The connection between interfaces can be pre-loaded or can be changed dynamically. 3
+ +The list of available resources is specified in JSON files as part of the OSVR distribution and is part of the OSVR documentation.
+ +An application can communicate with a plugin in two ways:
+ +An application can use either or both methods. For instance, some applications much choose to query the orientation tracker state in the main loop using a synchronous call whereas they might register for callbacks on certain game controller button presses. Please see examples in the OSVR documentation.
+ +There are three pieces of a minimal application:
+ +These are illustrated in the message state diagram below:
+ +Of course, such an application doesn't really use OSVR in a very productive sense. A sample flow of a more realistic application would also:
+ +Probably before the main loop starts:
+ +During main loop:
+ +During application shutdown, shutdown the library (providing the client context).
These are illustrated in the message state diagram below:
+ +Device and analysis plugins are dynamically loaded by OSVR. Plugins provide support for new types of hardware or analysis functions such as gesture engines.
+ +The OSVR documentation contains example plugins. If you are a hardware or software developer that wants to create an OSVR plugin, please contact us for additional details.
+ +To support a new game engine beyond those already supported in OSVR, the best place to start is "OSVR for game developers" since from an OSVR perspective, a game engine is an application that uses OSVR rather than an OSVR plugin.
+ +As an open-source project, the OSVR community will have a very strong influence on future directions. At present, planned improvements include the following:
+ +We are excited to work with you and see what we can build together with OSVR!
+ + + +Ryan A. Pavlik1, April 27, 2015
+ +In order to provide both proven performance and wide compatibility, portions of OSVR use and extend the VRPN (Virtual Reality Peripheral Network) software and device model. Early releases of OSVR provided an experimental way to integrate external VRPN trackers into OSVR, but the 0.2 release removed that experimental support in favor of complete support for using tracker, button, and analog devices from an external (local or remote) VRPN server.
+ +The change was required to integrate these devices into the metadata-rich "path tree" model used by OSVR. Native OSVR device drivers provide not only access to the device data, but also a JSON "device descriptor" that describes the device capabilities and the semantic meaning of the numbered sensors. As standard VRPN servers lack this additional data, using a VRPN device in OSVR requires that the user supply this information, through the configuration file, to properly integrate the device.
+ +As mentioned above, the methods described in this note require a version of OSVR-Core 0.2 or newer. Both client (application) DLLs and the server package must be 0.2 or newer, as that release broke protocol compatibility for metadata to enable greater functionality.
+ +With regards to the external VRPN server, it may be any 07.XX release (newer preferred, particularly for button devices), running locally or remotely. (This support does includes devices that integrate the VRPN server as their native reporting protocol, as long as the protocol is version 7.)
+ +If you run a vrpn_server
process on the same machine as the OSVR server, you will have to pass an alternate port number (such as 3884
) as a command line argument to vrpn_server
to avoid collision with the embedded VRPN server in OSVR. Configuration of such an external server is beyond the scope of this document: it is assumed that you know the device name (often something like Tracker0
) and the server (commonly just a hostname like trackerserver
, but may be hostname and port localhost:3884
for the suggested local vrpn_server
, or even include transport tcp://trackerserver:3884
). These two parts are typically specified with the @
symbol separating them, e.g. Tracker0@localhost:3884
.
For the purposes of this document, we'll assume you are using some device similar in functionality to a tracker wand or a Razer Hydra. That is, a device that provides tracking, button, and analog data on a single name. This process may be repeated to add any number of external VRPN device names to the system.
+ +You may wish to consult or start with the osvr_server_config.externalvrpn.sample.json
sample config file as you create your own. The documentation below walks through construction of this config file section by section.
A text editor with support for JSON can be very helpful in editing these configuration files, for syntax checking, code folding, and automatic indentation.
+ +The first step is to create the path tree node representing your external device, with the device name, server, and descriptor data. This is done in a top-level element of the config file (osvr_server_config.json
by default) object called externalDevices
. A sample excerpt showing this section follows.
"externalDevices": {
+ "/myExternalDevice": {
+ "deviceName": "Tracker0",
+ "server": "localhost:3884",
+ "descriptor": /* can also provide a file path here, or this is the minimal */ {
+ "interfaces": {
+ "tracker": {},
+ "button": {},
+ "analog": {}
+ }
+ }
+ }
+ }
+
+
+The key externalDevices
refers to a JSON object, where each key is the "path" where a device node will be created, and the value is an object containing the information required to create that node. (In the case of a native OSVR device, this path would have two levels: first the plugin name then the device name, like /com_osvr_Multiserver/OSVRHackerDevKit0
. Once this externalDevices
section is set up, you'll be able to refer to your external device just as you would a native OSVR device.)
In the example above, we've arbitrarily chosen /myExternalDevice
as the path. The deviceName
key contains our VRPN device name (Tracker0
), while the server
key contains the server info (localhost:3884
). The last element of the object, labeled descriptor
, is the metadata that OSVR makes use of but that VRPN does not provide. On the most basic level, it can be what is shown here: just an embedded JSON object with an interfaces
member, that contains an object with members for each interface type you want to access. Instead of an object, you could also supply a string, which would be interpreted as a filename containing a JSON device descriptor. This is useful particularly if you're using the same device in multiple places: you can share the device descriptor file separately from the server config.
Variations of the minimal descriptor embedded and described above is sufficient to follow the rest of these instructions and use your device in OSVR. As such, further explanation of the full device descriptor format is beyond the scope of this document. However, if you wish (for instance, if you might distribute your descriptor file), you can provide a fully-featured device descriptor like those embedded in OSVR plugins. Some examples of full device descriptors follow:
+ + + +You may use the OSVR JSON Editor web app to help you compose this section: it is a single-page application that uses the device descriptor JSON Schema to automatically generate an editor interface.
+ +At this point, you may stop and test your config file. You should see a line resembling the following when you run osvr_server
with your config file:
[OSVR Server] External devices found and parsed from config file.
+
+
+Running the osvr_print_tree
utility should also show you something like this:
[ DeviceElement] /myExternalDevice
+ - corresponds to Tracker0@localhost:3884
+[InterfaceElement] /myExternalDevice/analog
+[InterfaceElement] /myExternalDevice/button
+[InterfaceElement] /myExternalDevice/tracker
+
+
+If you see the DeviceElement
lines, but not the InterfaceElement
lines, then there was an issue parsing your descriptor and it didn't find an interfaces
section. You might find more information in the output of osvr_server
.
The OSVR system strongly discourages the use of hardware-specific paths, and instead recommends your application use "semantic paths" that can be set up to point to different hardware resources on different systems. (In older versions of OSVR, you may have seen mention of "routes" - that is the old name for the same concept. The old routes-based config file syntax is deprecated, with the new aliases syntax preferred.) On devices with OSVR plugins, their device descriptor usually sets up a /semantic
tree underneath the device node, and may also include an automaticAliases
section that provides suggestions for global alias paths. Unless you've added all these features to the device descriptor you provided in step 1, you'll have to set up the appropriate aliases in the config file.
Common paths include:
+ +/me/head
- for a head tracker/me/hands/left
- for a hand tracker or a device (wand, etc) held in the left hand./me/hands/right
- similarThere is also a convention of placing "controller" inputs (buttons, triggers, joysticks) under /controller
, and further /controller/left
and /controller/right
when the inputs are so associated with a tracker. Fewer specific paths are "well-known" in this portion of the path tree, and you are encouraged to create and use other semantic paths as aliases (/actions/jump
as an alias for /controller/a
) in your application and configuration.
The aliases, like the external devices, are configured with a JSON object in the config file, this time under the key aliases
. An excerpt might look like this:
"aliases": {
+ "/controller/trigger": "/myExternalDevice/analog/0",
+ "/controller/a": "/myExternalDevice/button/0",
+ "/me/head": {
+ "rotate": {
+ "axis": "-x",
+ "degrees": 180
+ },
+ "child": {
+ "changeBasis": {
+ "x": "x",
+ "y": "-z",
+ "z": "-y"
+ },
+ "child": "/myExternalDevice/tracker/0"
+ }
+ },
+ "/me/hands/right": {
+ "rotate": {
+ "axis": "-x",
+ "degrees": 180
+ },
+ "child": {
+ "changeBasis": {
+ "x": "x",
+ "y": "-z",
+ "z": "-y"
+ },
+ "child": "/myExternalDevice/tracker/1"
+ }
+ }
+ }
+
+
+This example shows two different syntaxes. The simplest, used for analog and button devices, as well as for trackers that do not need any transformation applied to align with the OSVR global coordinate system, is to simply have the alias path to create as the key, and the path the alias points to as the string value. The first two entries in the example above take this form.
+ +The second syntax (used by the second two entries) is more complex because it enables application of a transformation tree to tracker data. The key, as before, is the alias path to create. However, in this case, the value is a JSON object. This object can have arbitrary levels of nesting, with each internal level specified by the key child
. The final level must terminate with a child
key whose value is a string: the path that the alias transforms and points to. Each level may contain transformation objects (like changeBasis
and rotate
shown here) - see other example files and/or separate documentation on supported transformations.
In all cases, the alias terminates at a path under the device path you set up in step 1. The syntax is /devicename/interfacename/sensornumber
(or occasionally /devicename/interfacename
for all sensors). Of course, if you set up a device descriptor that contains semantic paths, you can reference those instead of the raw device path in this step.
At this point, you can run osvr_server
again with your config file. The server output will not change, but theh output of running osvr_print_tree
will reflect the changes to the path tree that your aliases made. For the example aliases above, you'll see something like this:
[ DeviceElement] /myExternalDevice
+ - corresponds to Tracker0@localhost:3664
+ - corresponds to Tracker0@localhost:3884
+[ SensorElement] /myExternalDevice/analog/0
+[InterfaceElement] /myExternalDevice/button
+[ SensorElement] /myExternalDevice/button/0
+[InterfaceElement] /myExternalDevice/tracker
+[ SensorElement] /myExternalDevice/tracker/1
+[ SensorElement] /myExternalDevice/tracker/0
+[ AliasElement] /controller/a
+ -> /myExternalDevice/button/0
+[ AliasElement] /controller/trigger
+ -> /myExternalDevice/analog/0
+[ AliasElement] /me/hands/right
+ -> {"child":{"changeBasis":{"x":"x","y":"-z","z":"-y"},"child":"/myExternalDevice/tracker/1"},"rotate":{"axis":"-x","degrees":180}}
+[ AliasElement] /me/head
+ -> {"child":{"changeBasis":{"x":"x","y":"-z","z":"-y"},"child":"/myExternalDevice/tracker/0"},"rotate":{"axis":"-x","degrees":180}}
+
+
+There are a few changes to notice. First, you'll see there are now AliasElement
entries for each alias we configured, and the default settings of osvr_print_tree
enable printing the target of each alias, whether a string or the compact version of your JSON transform. Additionally, each path representing a sensor that is a valid target of an alias that you've specified is now explicitly mentioned as a SensorElement
. These sensor elements are automatically generated when resolving aliases. If you don't see a SensorElement
you were trying to target, then you've likely made a mistake in specifying the corresponding alias.
As a further test of any tracker aliases you've set up, use the OSVR Tracker Viewer application (distributed separately from the core/server - click here for binary downloads for the tracker viewer). Run it with the -h
command line argument to see how to specify which paths in the path tree you want to visualize. The application opens in the standard OSVR coordinate system, so you can verify that your transformations are correct.
Ryan A. Pavlik, PhD is a senior software engineer at Sensics. ↩
+