
SimOvni 2 Implementation

AR HMD | VR HMD

  • Framework : Unity3D / OpenXR / Android | Windows
  • Communication : TCP/IP over Wifi. KISS as much as possible
    • What information do we need to pass? Parameter updates and play/stop timeline commands.
    • Using JSON structures.

TCP is fast enough. We don't need blazing fast updates when updating the parameters.
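The two kinds of messages can be sketched as small JSON objects. A minimal Python sketch of this idea; the field names (`type`, `name`, `value`, `command`) are illustrative, not the project's actual wire format:

```python
import json

# Hypothetical message shapes -- the actual field names used by simovni2
# are not documented here, so these are illustrative only.

def make_param_update(name, value):
    """A parameter update: one named value changed on the investigator side."""
    return json.dumps({"type": "param", "name": name, "value": value})

def make_timeline_command(command):
    """A timeline command such as 'play', 'pause' or 'rewind'."""
    return json.dumps({"type": "timeline", "command": command})

def handle_message(raw):
    """Receiving-side dispatch on the 'type' field."""
    msg = json.loads(raw)
    if msg["type"] == "param":
        return ("param", msg["name"], msg["value"])
    if msg["type"] == "timeline":
        return ("timeline", msg["command"])
    raise ValueError("unknown message type: " + msg["type"])

print(handle_message(make_param_update("azimuth", 123.5)))  # ('param', 'azimuth', 123.5)
print(handle_message(make_timeline_command("play")))        # ('timeline', 'play')
```

A self-describing `type` field keeps the protocol KISS: one TCP stream, one dispatch point on each side.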

Some C# Libraries for that :

  • TCP Client-Server Connection Example | Unity | C# | Bidirectional communication sample
    • Client can connect to the server
    • Client can send and receive messages
    • Server accepts clients and reads client messages
    • Server sends messages to the client
  • Json Reader Writer (used as is) : https://github.com/zanders3/json
    • Attempts to parse JSON files with minimal GC allocation
    • Nice and simple “[1,2,3]”.FromJson<List<int>>() API
    • Classes and structs can be parsed too!

laptop PC | phone | tablet

The first prototype app runs on a laptop PC, to go much faster in development and to have more screen room to lay out the sliders/buttons.

Software stack identical to the witness side.

The advantage of Unity3d is its ability to retarget the same app for mobile as well as PC. Mobile: not now. For now it runs on a laptop PC for much faster development. Retargeting for mobile later should be easier with Unity3d, although there may be significant work because of differences in button handling.

Trajectory key-points interpolation:

  • if on investigator side
    • easier to debug and to evolve the logic.
    • big visual jitter guaranteed in the HMD (that's the main issue)
  • if on witness side
    • a little bit more complex to debug
    • ensures fluid trajectories

Both are implemented as of v0.1.0
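The interpolation itself can be sketched as a linear blend between the two keypoints surrounding the current time. A Python sketch under an assumed data layout (the project keeps keypoints in TrajectoryStructures.cs; the tuple/dict layout here is illustrative). Running this per frame on the witness side gives smooth motion; running it on the investigator side and sending positions over TCP limits updates to network rate, hence the jitter:

```python
# Minimal key-point interpolation sketch: each keypoint is a
# (time, parameters) pair, and every parameter is blended linearly
# between the two keypoints surrounding the requested time.

def interpolate(keypoints, t):
    """keypoints: list of (time, dict of parameter values), sorted by time.
    Clamps to the first/last keypoint outside the covered time range."""
    if t <= keypoints[0][0]:
        return dict(keypoints[0][1])
    if t >= keypoints[-1][0]:
        return dict(keypoints[-1][1])
    for (t0, p0), (t1, p1) in zip(keypoints, keypoints[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)  # blend factor in [0, 1]
            return {k: p0[k] + u * (p1[k] - p0[k]) for k in p0}

kps = [(0.0, {"azimuth": 0.0, "distance": 10.0}),
       (10.0, {"azimuth": 90.0, "distance": 30.0})]
print(interpolate(kps, 5.0))  # halfway: {'azimuth': 45.0, 'distance': 20.0}
```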

Development environment selected

All this software is free.

In the Microsoft guidelines linked below, you

  • need
    • Visual Studio. Preferably the same version as mine, to be sure: Visual Studio Community 2019 v16.9.4
      • After installation, you will need to install optional features. Follow the instructions given on the Microsoft “install the tools” page linked below.
    • Unity3d version 2020.3.27f1 (you can install it, using the Unity Hub tool)
  • don't need
    • The HoloLens2 simulator/emulator
    • Windows Mixed Reality simulator
    • MixedRealityFeatureTool-1.0.2109.0-Preview is not required because the github project comes with the Mixed Reality Toolkit assets pre-installed. You may need it later when you want to test other features of the HMD.
    • In unity, no need to set up a new project, just open the simovni2 project

Knowing that… Follow the guidelines in

https://docs.microsoft.com/en-us/windows/mixed-reality/develop/install-the-tools (partly specific to HoloLens2)

Source code / Unity Project https://github.com/albion2000/simovni2

But first, you need to install the development environment.

When opening the project in unity for the first time, at the end of the import you may get 3 errors (Oculus-related) and 1 warning (about a duplicate identifier, or “importer generated inconsistent result for asset”) that you can ignore.

The same project can generate:

  • the target client application for the HoloLens 2 (simovni2 scene. That scene has the 3 client objects)
  • the server application on PC (simovni2_fastdev scene compiled with the 3 client objects inactive)
  • during development, client+server in the same application (simovni2_fastdev scene with the 3 client objects active) that you can run in the integrated player, or as a standalone exe. The 3D rendering is the client side; the overlaid UI is the server side. It starts very fast. All the communications are logged into the unity console for checks. This way of debugging is much faster than using the HoloLens or 2 separate apps.
  • during development, when the client is installed on the HL2, the server application can still be played in unity (simovni2_fastdev scene with the 3 client objects inactive), run in the integrated player; it can communicate with the HL2.

The trajectory.txt file is loaded/created interactively using the server & client. You can start without any trajectory.txt file and create it from scratch by adding keypoints.

The trajectory file is handled by the server (PC) only.

Things to tune

  • the IP address of the SERVER should be put into the code of TCPServer.cs and TCPClient.cs — replace “192.168.43.121” with your specific IP (in Assets/Scenes/scripts)
  • When running the server app standalone, that is, outside of the unity editor/debugger, you should copy trajectory.txt into the build directory (side by side with simovni2.exe)
  • tune the firewall on the server to accept TCP connections from the HoloLens: Private Network, accept incoming TCP connections on all ports (even though we use only port 9005, it seems necessary to both specify TCP explicitly, not “all protocols”, and open all the ports for the TCP connection to succeed).
  • shape of the UAP (png or 3d object)
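To check the firewall and network setup without launching the full app, a quick connectivity probe against port 9005 can help. A Python sketch; replace the address with your server's IP:

```python
import socket

# Quick network check: try to open a TCP connection to the server
# on port 9005 (the port used by simovni2). If this returns False,
# check the server IP and the firewall rules before debugging the app.

def can_connect(host, port=9005, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(can_connect("192.168.43.121"))  # True only if your server is reachable
```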

A quick test to know that it works in unity alone (at least)

  • open the simovni2_fastdev scene; activate the 3 client objects in the scene graph tree (select on the left, then activate on the right in the “inspector”). Press Play (button of the Unity IDE). Look at the console at the bottom. You should see a log that the server is connected to a client.
  • then press the buttons [load trajectory], [Show +], [Set North…], [Play]; at ten seconds, the UAP should appear.
  • you can ignore the warning “Lock input to game view…”

For tests, on the PC side, you can run it in the IDE.

That's the version on Github.

Should work with more HMDs than just the HoloLens2

unity3d version 2020.3.27f1

The demo projects come with Mixed Reality Features installed :

  • Mixed Reality Toolkit foundation 2.7.2
  • Mixed Reality Toolkit Standard Assets 2.7.2
  • Mixed Reality OpenXR Plugin 1.1.2

They were integrated in the demos using the tool MixedRealityFeatureTool-1.0.2109.0-Preview.

Visual Studio Community 2019 v16.9.4 for the compilation of the project for the HoloLens 2 target

For compiling/building for the HoloLens 2 target :

  • scene to build : scenes/simovni2 only (menu file>build settings. click on [add open scene] if simovni2 is not already selected)
  • Platform to select in Unity3d : Universal Windows Platform. You may need to switch platform. (menu file>build settings)
  • click build in unity → then select/create a (new) directory where the generated files will be produced.
  • after the generation, unity will open a file explorer. Open the (new) directory. Open the new solution simovni2.sln with Visual Studio.
  • Target to select in visual studio : Release / ARM64 / Device / Native Only.
  • Build or rebuild (menu build>build solution). Go take a coffee.
  • Deploy (menu build>deploy solution) from visual studio to the HoloLens by USB typically (or wifi).

A new icon “Simovni2” should then appear in the HL2 3D interface.

I recommend that before each new deploy to the HL2, you uninstall the previous version of simovni2. It worked more often that way for me.

For compiling/building for the PC target (server standalone exec) :

  • scene to build : scenes/simovni2_fastdev only (menu file>build settings. With the client objects disabled. Click on [add open scene] if simovni2_fastdev is not already selected)
  • Platform to select in Unity3d : PC, MAC. You may need to switch platform. (menu file>build settings)
  • click build in unity → then select/create a (new) directory where the generated files will be produced.
  • after the generation, unity will open a file explorer. Open the (new) directory. You should find the executable file
  • optionally copy a previously created trajectory.txt file alongside the simovni2.exe file
  • press ALT+F4 to quit the app. Press ALT+ENTER to leave full screen.

May only work with the HoloLens2

unity3d version 2019.4.19f1

The demo projects (not on github, not published) come with Mixed Reality Features installed :

  • Mixed Reality Toolkit foundation 2.5.3
  • Mixed Reality Toolkit Standard Assets 2.5.3

They were imported using the tool MixedRealityFeatureTool-1.0.2109.0-Preview.

Visual Studio Community 2019 v16.9.4 for the compilation of the project for the HoloLens 2 target

Target to select in visual studio : Release / ARM64 / Device / Native Only. Deployed from visual studio to the HoloLens by USB typically (or wifi).

A new icon “Simovni2” should appear.

Unity3d Scene Graph

The investigator machine is the TCP server (only because implementing a server on a HoloLens is problematic, due to its protections).

Don't forget to tune the firewall on the server to accept TCP connections from the HoloLens: Private Network, accept incoming TCP connections on all ports (even though we use only port 9005, it seems necessary to both specify TCP explicitly, not “all protocols”, and open all the ports for the TCP connection to succeed).

To understand how it works, the best is to look at an example scene in Unity, but here for reference is a short explanation of the reference scene: objects and their hierarchy in 3D.

  • Controller Server object (running the polarControlServer.cs script): this is the main object. It manages all the parameters under control of the investigator. It manages the trajectory using the trajectory “class” (TrajectoryStructures.cs). It communicates using the Messaging Server object.
  • Messaging server object (running the messaging.cs script): It detects changes to the data and sends these changed data to the client, using the TCP Server object.
  • TCP Server object (running the TCPServer.cs script): manages low level communication (bytes)
  • HoloHUD : contains just the sliders, buttons and toggles
  • Main Camera : this is the virtual camera that matches in real time the position and orientation of the HMD. Position and orientation are expressed relative to the startup pose (pos & orient). It is updated at low frequency for low overhead.
    • Sights : this is a yellow cross at the center of the view that can be shown or hidden. That can be used in order to point a direction, even without using hand tracking or eye tracking.
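The change-detection idea behind the Messaging object can be sketched as: keep a copy of the last values sent, and emit a message only for the values that actually changed. A Python sketch of that idea; the class, callback, and message fields are illustrative, not the messaging.cs implementation:

```python
import json

# Sketch of change detection: compare current parameters against the
# last set sent, and send a JSON message containing only the changed
# values. Message layout is illustrative.

class ChangeSender:
    def __init__(self, send):
        self.send = send       # callback taking a JSON string (e.g. TCP write)
        self.last_sent = {}

    def update(self, params):
        changed = {k: v for k, v in params.items()
                   if self.last_sent.get(k) != v}
        if changed:
            self.send(json.dumps({"type": "param", "values": changed}))
            self.last_sent.update(changed)

out = []
sender = ChangeSender(out.append)
sender.update({"azimuth": 10.0, "distance": 5.0})  # both values sent
sender.update({"azimuth": 10.0, "distance": 6.0})  # only distance resent
print(len(out))           # 2 messages total
print(json.loads(out[1])) # {'type': 'param', 'values': {'distance': 6.0}}
```

Sending only deltas keeps traffic small, which is why plain TCP is fast enough here.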

Buttons

time line control
  • play
    • while playing, all the parameters are under control of all the keypoints
  • pause
    • one can/should edit keypoints only in pause. In pause, one can move the sliders, create a new keypoint, overwrite a previous one, or delete one
  • rewind
  • loop
    • if active, will make the trajectory loop when played
  • prev keypoint
    • jumps instantly to the previous closest keypoint in time
  • next keypoint
    • jumps instantly to the next closest keypoint in time
for editing while in pause
  • create or overwrite keypoint
    • if already at a keypoint (that is, if time exactly matches the time of one of the keypoints already created), this will overwrite it,
    • else it will create/insert an additional one using the current time, and the current values of all the parameters.
  • remove keypoint
    • if already at a keypoint (that is, if time exactly matches the time of one of the keypoints already created), it will remove it.
  • load (trajectory)
  • save (trajectory)
  • send trajectory
    • sends the current trajectory to the HMD. Pressing this button is not really necessary, since it is done automatically when going into “remote control” mode
  • set north using witness direction
    • one should point toward geographical north
    • it is essential to do this first on site, so that all captured directions are correctly referenced to geographical north
  • get witness sighting direction
    • takes the current direction of the HMD as the direction of the UAP, typically in order to later create a keypoint in polar mode
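The pause-mode editing rules above (overwrite on an exact time match, otherwise insert in time order; remove only on an exact match) can be sketched as follows. The data layout is illustrative, not the project's own:

```python
# Keypoint editing sketch: keypoints is a list of (time, params)
# kept sorted by time.

def create_or_overwrite(keypoints, t, params):
    for i, (kt, _) in enumerate(keypoints):
        if kt == t:
            keypoints[i] = (t, params)        # exact time match: overwrite
            return
        if kt > t:
            keypoints.insert(i, (t, params))  # insert, keeping time order
            return
    keypoints.append(t_params := (t, params))  # after the last keypoint

def remove(keypoints, t):
    """Removes only on an exact time match; returns True if removed."""
    before = len(keypoints)
    keypoints[:] = [(kt, p) for kt, p in keypoints if kt != t]
    return len(keypoints) != before

kps = []
create_or_overwrite(kps, 0.0, {"azimuth": 0.0})
create_or_overwrite(kps, 10.0, {"azimuth": 90.0})
create_or_overwrite(kps, 5.0, {"azimuth": 30.0})  # inserted between
create_or_overwrite(kps, 5.0, {"azimuth": 45.0})  # exact match: overwritten
print([t for t, _ in kps])  # [0.0, 5.0, 10.0]
print(remove(kps, 7.0))     # False: no keypoint at t=7
```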

Toggles

  • polar/cartesian control mode
  • control angular size
    • in this mode, the angular size slider controls the angular size of the object. If you move the slider, it will change the distance of the UAP, keeping its size constant. If you change the distance, the size will be adjusted. If you change the size, the distance will be adjusted.
    • out of this mode, the angular size slider cannot be moved because it is updated based on the distance and size of the UAP.
  • drawing/'3D model' mode
  • billboard mode
    • in this mode, the UAP always faces the witness. Useful when using the drawing mode
  • show/hide sights (in the witness's HMD)
  • remote control
    • you are supposed to use this mode only when you have finished editing the trajectory and are ready for a smooth playback
    • in this mode, the trajectory playback is computed by the HMD, not by the investigator device. This ensures a smooth trajectory replay.
    • in this mode, you should only play/pause/rewind.
    • in this mode, one cannot edit the trajectory.
    • going into this mode starts by sending the trajectory keypoints to the HMD
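The coupling behind the angular-size toggle follows from the standard apparent-size formula: an object of physical size s at distance d subtends an angle of 2·atan(s/(2d)). A Python sketch of the two directions of that relation (function names are illustrative):

```python
import math

# Apparent angular size of an object of physical size `size` (m)
# seen at `distance` (m), and the inverse: the distance that gives a
# requested angular size while the physical size is held constant
# (what the "control angular size" mode does when the slider moves).

def angular_size(size, distance):
    """Apparent angular size in degrees."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

def distance_for(size, angular_deg):
    """Distance giving the requested angular size, size held constant."""
    return size / (2 * math.tan(math.radians(angular_deg) / 2))

d = distance_for(1.0, angular_size(1.0, 20.0))
print(round(d, 6))  # 20.0 -- the two functions are inverses
```

Out of this mode, the same formula runs in the forward direction only, which is why the angular-size slider is read-only there.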

Sliders

movable while playing and in pause
  • time (current time on trajectory in seconds, starting from 0)
    • if moved while playing, it lets you move back and forth along the recorded trajectory quickly
    • if moved while in pause, the behavior is completely different: it is for trajectory editing, with the intent of creating a new key-point at a new time, starting with the current settings when pause was pressed (alt, az, roll, size, brightness, etc.)
movable only while in pause for trajectory editing
  • in polar control mode
    • azimuth
    • altitude (angle over the horizon)
    • distance
  • in cartesian control mode
    • position x, y, z
  • brightness (to be expressed at some point in candela/m²)
  • roll
  • angular size
  • size

Witness side objects

  • Controller Client (running the polarControlClient.cs script): this is the main object. It manages all the parameters under control of the witness and receives data updates from the investigator. It communicates using the Messaging Client object.
  • Messaging Client (running the messaging.cs script): it sends any detected changes to the data toward the server. It communicates with the server using the TCP Client object. It is the same source file as for the server.
  • TCP Client (running the TCPClient.cs script): manages low level communication (bytes)
  • Main Camera : this is the virtual camera that matches in real time the position and orientation of the HMD. Position and orientation are expressed relative to the startup pose (pos & orient).
    • Sights : this is a yellow cross at the center of the view that can be shown or hidden. That can be used in order to point a direction with the head, even without using hand tracking or eye tracking.
  • North : this is an object that contains the current direction of the North relative to the direction of the HMD on startup. It can be updated by a click on a button.
    • NorthDirectionT : shows a red T in the direction of the North.
    • PolarPanYawPitch : that is the object that follows the direction alt/az of the UAP in polar mode.
      • DistanceZ : object that follows the distance to the UAP in polar mode.
        • QuadbaseP : intermediate object that is optionally able to place the UAP always facing the witness
          • PanP3D : this object allows for a local rotation and scaling of the UAP (evolves in real time)
            • Model3DP : this is the 3d model of the UAP. It should be scaled and centered in unity so that it precisely fits into a cube of 1m side (static). The object CheckSize already placed at 0.5m can help for that.
          • PanP2D : this is the object that displays the image of the UAP (See field named Sprite in the sprite renderer section in the inspector). (scale and roll evolve)
            • The image has to be a 512×512 pixels png image. The UAP should be drawn at the center of the image. To be imported as a sprite (2D&UI) under the name UAP.png with an import setting of 512 pixels per unit in unity.
    • Cartesian : that is the object that follows the position of the UAP in cartesian mode
      • QuadbaseC : intermediate object that is optionally able to place the UAP always facing the witness
        • PanC3D : this object allows for a local rotation and scaling of the UAP (evolves in real time)
          • Model3DC : this is the 3d model of the UAP. It should be scaled and centered in unity so that it precisely fits into a cube of 1m side (static). The object CheckSize already placed at 0.5m can help for that.
        • PanC2D : this is the object that displays the image of the UAP (See field named Sprite in the sprite renderer section in the inspector). (scale and roll evolve)
          • The image has to be a 512×512 pixels png image. The UAP should be drawn at the center of the image. To be imported as a sprite (2D&UI) under the name UAP.png with an import setting of 512 pixels per unit in unity.
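The polar placement performed by the PolarPanYawPitch / DistanceZ chain can be sketched as a conversion from (azimuth, altitude, distance) to a 3D position. Python sketch; the axis convention (x east, y up, z north) mirrors Unity's x-right / y-up / z-forward layout with forward aligned on north, but is an assumption here:

```python
import math

# Polar-to-position sketch: azimuth measured from north, altitude
# measured over the horizon, distance in meters. Axis convention is
# an assumption: x east, y up, z north.

def polar_to_position(azimuth_deg, altitude_deg, distance):
    az = math.radians(azimuth_deg)
    alt = math.radians(altitude_deg)
    horizontal = distance * math.cos(alt)  # projection onto the ground plane
    return (horizontal * math.sin(az),     # x: east
            distance * math.sin(alt),      # y: up
            horizontal * math.cos(az))     # z: north

x, y, z = polar_to_position(90.0, 0.0, 10.0)  # due east, on the horizon
print(round(x, 6), round(y, 6), round(z, 6))
```

In the scene graph the same result is obtained by nesting transforms (yaw/pitch rotation, then a translation along the local z axis), which is the natural way to express it in Unity.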

HMD Types

  • simovni2/implementation.txt
  • Last modified: 2025/01/31 22:54
  • by laurentc