***This README is taken from https://gitlab.eecs.umich.edu/umich-dnng/dnng-hololens-visualization. For more information about the difference between that project and this one, read my documentation [here](https://docs.google.com/document/d/1KaTxj4kSwnwNWMsDt_m22U01gDY-sTMNzXvTO43ifFU/edit?pli=1) or check out the file "Fall 2023 Useful Documentation.pdf".***
***For even more information about new features, check out this presentation [here](https://docs.google.com/presentation/d/1sknvq-36w_JeLga8Q-VlBArCInwPHfvG2oIda9D2c7g/edit?usp=sharing) or check out the file "DNNG Fall Presentation.pdf".***
Visualizes radiation data from the H2DPI (Handheld Dual-Particle Imager) as an easy-to-access AR visualization on the HoloLens 2.
Example use cases include detecting radiation leaks in a facility, national security applications, and preventing the smuggling of nuclear materials.
## Usage
### Features
- *Instructions*: See program introduction and capabilities.
- *Toggle Display*: Toggle the virtual monitor (located above the H2DPI dotted outline) to view cone image data.
- *Accumulation*: View cone image data one by one or overlaid together.
- *Neutron Mode*: View either neutron or gamma hotspots in real time (default is gamma).
- *Chart*: Displays a histogram of calculated neutron energies.
- *Advanced*: Toggle different data viewing settings.
- *Lines*: Show/hide directional rays pointing to where the source is located.
- *Mesh*: Show/hide the spatial mesh automatically generated by the HoloLens.
- *Cycle Colormap*: Cycle through different colormaps - cividis, inferno, plasma, viridis.
- *Cone Image*: View generated cone image from H2DPI data.
- *Fast mode*: Uses machine learning and the relative count rates of the bars within the H2DPI to provide a quick estimate of the azimuthal direction of the source.
- Voice commands are supported for easier access to all hand-menu content.
### Tools Needed
- Acquisition Computer
- HoloLens 2
- H2DPI
- Printed QR code (content doesn't matter)
- Optional: Microsoft HoloLens App (HoloLens livestream purposes)
- Download link: https://apps.microsoft.com/store/detail/microsoft-hololens/9NBLGGH4QWNX?hl=en-us&gl=US
### Steps
1. Place a QR code on a flat surface on top of the H2DPI box, such that the center of the QR code is above the center of the H2DPI.
2. Launch the dnng-hololens-visualization app in the HoloLens 2 app library.
3. Place the virtual H2DPI by scanning a QR code (look at the QR code with the HoloLens). This aligns a virtual blue box outlining the aluminum box containing the physical H2DPI.
4. Run ComPASS and the Python executable on the acquisition computer.
5. Rotate your palm up on either hand to open up the hand-menu and access various settings.
6. To take pictures or videos, either open the HoloLens menu by tapping on either wrist and clicking the camera or record button, or take them through the Microsoft HoloLens app (note that the HoloLens has to be connected to view photos and videos in the app).
*HoloLens device info*:
- HoloLens pin: 110192
- HoloLens IP address: 35.3.201.144
- HoloLens device portal username: HololensDNNG
- HoloLens device portal password: Hololensmostlysunny
## Project Structure
The data pipeline goes in this order:
`Compass Data Files --(Read by)--> Data Parser (Rust) --(Called via Python subprocess)--> send_cones.py --(Over the network)--> Unity`
## ComPASS
The H2DPI detects radiation and interfaces with ComPASS to produce binary data files on the acquisition computer.
### Data Parser
The data parser reads the binary data files produced by ComPASS, parses them into single coincidences and then double coincidences, and uses the double coincidences to infer cones. These cones are then written to STDOUT as JSON.
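As a rough illustration of what a cone record might look like when read back in Python (the actual schema is defined by the Rust parser, so the field names and units below are assumptions, not the real output format):

```python
import json

# Hypothetical cone record: an apex position, a cone axis, and an opening
# angle. Check the Rust source for the actual field names emitted to STDOUT.
line = '{"apex": [0.0, 0.0, 0.0], "axis": [0.12, -0.87, 0.48], "half_angle_deg": 32.5}'
cone = json.loads(line)
print(cone["axis"], cone["half_angle_deg"])
```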
There is also an in-development C++ version of the data-parser, in the `data-parser-cpp` folder.
#### Building
The data parser is written in Rust. To build it, first install Rust (via https://rustup.rs/), then use `cargo` (Rust's package manager, similar to pip) to build and run the project.
Assuming you have both Rust and cargo installed, to build the data parser:
1. Navigate to `./data-parser/`
2. Run `cargo build --release`
3. The build output should be located at `./data-parser/target/release/data_parser.exe`
More resources about Rust and cargo:
- https://doc.rust-lang.org/book/
- https://doc.rust-lang.org/cargo/
### send_cones.py
The `send_cones.py` script runs the data parser executable using Python's built-in `subprocess` utility. It then sends the parsed data over the network to the HoloLens. The HoloLens IP address is stored in the `endpoint` variable.
The Python script was built using these libraries/versions:
- python (3.8.10)
- tensorflow (2.5.0)
- numpy
- requests
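A minimal sketch of the overall flow, assuming the parser emits one JSON cone per line on STDOUT and the HoloLens listens on an HTTP endpoint (the executable path, endpoint URL, and payload shape below are illustrative, not the exact values used by `send_cones.py`):

```python
import json
import subprocess

import requests

# Launch the data parser and forward each parsed cone to the HoloLens.
PARSER = "./data-parser/target/release/data_parser.exe"
endpoint = "http://35.3.201.144:8080/cones"  # hypothetical port/path

proc = subprocess.Popen([PARSER], stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    cone = json.loads(line)
    requests.post(endpoint, json=cone, timeout=5)
```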
### Unity
The Unity app does simple backprojection using the cones it receives, and then projects the angular data onto the 3D world (using the spatial mesh generated by the HoloLens), allowing the user to see the direction and relative intensity of radiation sources.
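For intuition, here is a minimal sketch of simple cone backprojection in Python. This is not the Unity/C# implementation; the function name, inputs, and tolerance are assumptions made for illustration. Each cone "votes" for the candidate directions that lie close to its surface, and hotspots emerge where many cones agree.

```python
import numpy as np

def backproject(cones, directions, tolerance_deg=5.0):
    """Score each candidate unit direction by how many cone surfaces pass near it.

    cones: iterable of (axis, half_angle_deg) pairs, axis being a unit 3-vector.
    directions: (N, 3) array of unit vectors sampled over the sphere.
    """
    scores = np.zeros(len(directions))
    tol = np.radians(tolerance_deg)
    for axis, half_angle_deg in cones:
        half_angle = np.radians(half_angle_deg)
        # Angle between each candidate direction and the cone axis.
        cosines = np.clip(directions @ np.asarray(axis, dtype=float), -1.0, 1.0)
        angles = np.arccos(cosines)
        # Directions lying close to the cone surface get a vote.
        scores += np.abs(angles - half_angle) < tol
    return scores
```

The directions with the highest scores indicate the most likely source directions, which the app can then project onto the spatial mesh.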
#### Building
The project was built using Unity version 2020.3.31f/MRTK 2 (which was suggested by the HoloLens documentation at the time) and Visual Studio 2019. Note that the Unity project can only be built on Windows.
After importing the project into Unity, to build:
1. Click 'File -> Build Settings...'
- Do not click 'Build and Run' as this will try to run on your local computer, not the HoloLens.
2. Configure with platform 'Universal Windows Platform' and the following settings:
- Target Device: HoloLens
- Architecture: ARM64
- Latest SDK
- Build Configuration: Master
The build process will generate a Visual Studio project. Open the project with Visual Studio 2019 and select the Master build configuration and ARM64 architecture.
Make sure the HoloLens is connected to the internet and enter the HoloLens IP address in the 'Machine name' field (see the linked resources for more info). The HoloLens needs to be turned on and connected to the same network (i.e. MWireless) to receive builds and data.
Also see:
- https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/unity-development-overview
- https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/build-and-deploy-to-hololens