========================================================================
README for Augmented Reality Sandbox (SARndbox) version 1.6
Copyright (c) 2012-2015 Oliver Kreylos
========================================================================

Overview
========

The Augmented Reality Sandbox is an augmented reality application scanning a sand surface using a Kinect 3D camera, and projecting a real-time updated topography map with topographic contour lines, hillshading, and an optional real-time water flow simulation back onto the sand surface using a calibrated projector.

Requirements
============

The Augmented Reality Sandbox requires Vrui version 3.1 build 004 or newer, and the Kinect 3D Video Capture Project version 2.8 or newer.

Installation Guide
==================

It is recommended to download or move the source packages for Vrui, the Kinect 3D Video Capture Project, and the Augmented Reality Sandbox into a src directory underneath the user's home directory. Otherwise, references to ~/src in the following instructions need to be changed.

It is also recommended to skip optional steps 4 and 6 in the following instructions. The Augmented Reality Sandbox does not need to be installed in order to be used; installation (to a system directory such as /usr/local) is only recommended if the Augmented Reality Sandbox will be used from multiple user accounts.

0. Install Vrui from ~/src/Vrui-<version>-<build> (see Vrui README file).

0.5 Install the Kinect 3D Video Capture Project from ~/src/Kinect-<version> (see the Kinect 3D Video Capture Project README file).

1. Change into the ~/src directory and unpack the Augmented Reality Sandbox tarball:
   > cd ~/src
   > tar xfz <download path>/SARndbox-<version>.tar.gz
   - or -
   > tar xf <download path>/SARndbox-<version>.tar

2. Change into the Augmented Reality Sandbox's base directory:
   > cd SARndbox-<version>

3. If the Vrui version installed in step 0 was not 3.1, or Vrui's installation directory was changed from the default of ~/Vrui-3.1, adapt the makefile using a text editor. Change the value of VRUI_MAKEDIR close to the beginning of the file as follows:
   VRUI_MAKEDIR := <Vrui install dir>/share/make
   where <Vrui install dir> is the installation directory chosen in step 0. Use $(HOME) to refer to the user's home directory instead of ~.

4. Optional: Adapt the makefile if the Augmented Reality Sandbox is to be installed in a different location, for example /usr/local. Set INSTALLDIR to the desired target location. The Augmented Reality Sandbox will then be installed in <INSTALLDIR>/bin, and its configuration files will be installed in <INSTALLDIR>/etc (where <INSTALLDIR> is the value of INSTALLDIR set in the makefile).

5. Build the Augmented Reality Sandbox:
   > make

6. Optional: Install the Augmented Reality Sandbox in the selected target location. This is only necessary if the INSTALLDIR variable in the makefile was changed. By default, the Augmented Reality Sandbox can be run from its base directory. To install:
   > make install
   - or, if the target location is a system directory -
   > sudo make install
   This will copy all executables into <INSTALLDIR>/bin, and all configuration files into <INSTALLDIR>/etc.

7. Optional: Add the directory containing the Augmented Reality Sandbox executables (~/src/SARndbox-<version>/bin in the default installation, <INSTALLDIR>/bin otherwise) to the user's search path. This allows running the Augmented Reality Sandbox from any directory.
   Using csh or tcsh:
   > setenv PATH ${PATH}:~/src/SARndbox-<version>/bin
   - or -
   > setenv PATH ${PATH}:<INSTALLDIR>/bin
   where <INSTALLDIR> is the target location set in the makefile.
   Using bash:
   > export PATH=${PATH}:~/src/SARndbox-<version>/bin
   - or -
   > export PATH=${PATH}:<INSTALLDIR>/bin
   These lines can also be added to the user's .cshrc or .bashrc files to make the additions persist between logins.
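As a condensed example, the following command sequence sketches a default build from the base directory. It assumes version 1.6 of the Augmented Reality Sandbox, a Vrui 3.1 installation in the default location (~/Vrui-3.1), and a tarball downloaded to ~/Downloads; adjust the version numbers and paths to match the actual download and the installation choices made above:

   > cd ~/src
   > tar xfz ~/Downloads/SARndbox-1.6.tar.gz
   > cd SARndbox-1.6
   > make
   > ./bin/SARndbox -h

Running SARndbox -h at the end merely prints the list of command line options and is a quick way to confirm that the executable was built; actually using the sandbox requires the calibration steps described below.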
Use
===

The Augmented Reality Sandbox package contains the sandbox application itself, SARndbox, and a calibration utility to interactively measure a transformation between the Kinect camera scanning the sandbox surface, and the projector projecting onto it. The setup procedure described below also uses several utilities from the Kinect 3D video capture project.

Setup and Calibration
---------------------

Before the Augmented Reality Sandbox can be used, the hardware (physical sandbox, Kinect camera, and projector) has to be set up properly, and the various components have to be calibrated internally and with respect to each other. While the sandbox can be run in "trial mode" with very little required setup, for the full effect the following steps have to be performed in order:

1. (Optional) Calculate per-pixel depth correction coefficients for the Kinect camera.

2. (Optional) Internally calibrate the Kinect camera.

3. Mount the Kinect camera above the sandbox so that it is looking straight down, and can see the entire sand surface. Use RawKinectViewer from the Kinect 3D video capture project to line up the depth camera while ignoring the color camera.

4. Measure the base plane equation of the sand surface relative to the Kinect camera's internal coordinate system using RawKinectViewer's plane extraction tool. (See "Using Vrui Applications" in the Vrui HTML documentation on how to use RawKinectViewer, and particularly on how to create / destroy tools.)

5. Measure the extents of the sand surface relative to the Kinect camera's internal coordinate system using KinectViewer and a 3D measurement tool.

6. Mount the projector above the sand surface so that it projects its image perpendicularly onto the flattened sand surface, and so that the projector's field-of-projection and the Kinect camera's field-of-view overlap as much as possible. Focus the projector to the flattened average-height sand surface.

7. Calculate a calibration matrix from the Kinect camera's camera space to projector space using the CalibrateProjector utility and a circular calibration target (a CD with a fitting white paper disk glued to one surface).

8. Test the setup by running the Augmented Reality Sandbox application.
Step 1: Per-pixel depth correction
----------------------------------

Kinect cameras have non-linear distortions in their depth measurements due to uncorrected lens distortions in the depth camera. The Kinect 3D video capture project has a calibration tool to gather per-pixel correction factors to "straighten out" the depth image.

To calculate depth correction coefficients, start the RawKinectViewer utility and create a "Calibrate Depth Lens" tool. (See "Using Vrui Applications" in the Vrui HTML documentation on how to create tools.) Then find a completely flat surface, and point the Kinect camera perpendicularly at that surface from a variety of distances. Ensure that the depth camera only sees the flat surface and no other objects, and that there are no holes in the depth images.

Then capture one depth correction tie point for each distance between the Kinect camera and the flat surface:

1. Line up the Kinect camera.

2. Capture an average depth frame by selecting the "Average Frames" main menu item, and wait until a static depth frame is displayed.

3. Create a tie point by pressing the first button bound to the "Calibrate Depth Lens" tool.

4. De-select the "Average Frames" main menu item, and repeat from step 1 until the surface has been captured from sufficiently many distances.

After all tie points have been collected:

5. Press the second button bound to the "Calibrate Depth Lens" tool to calculate the per-pixel depth correction factors based on the collected tie points. This will write a depth correction file to the Kinect 3D video capture project's configuration directory, and print a status message to the terminal.
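For reference, this and the following calibration steps are driven from RawKinectViewer, which is started from a terminal. The invocation below assumes that the Kinect 3D video capture project's executables are on the search path and that the first (or only) connected Kinect camera should be used; depending on the Kinect package version, a camera index may have to be passed as an argument (see that package's README):

   > RawKinectViewer

The "Calibrate Depth Lens" tool and the "Average Frames" menu item are then used from within the running application as described above.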
Step 2: Internally calibrate the Kinect camera
----------------------------------------------

Individual Kinect cameras have slightly different internal layouts and slightly different optical properties, meaning that their internal calibrations, i.e., the projection matrices defining how to project depth images back out into 3D space, and how to project color images onto those reprojected depth images, differ individually as well. While all Kinects are factory-calibrated and contain the necessary calibration data in their firmware, the format of those data is proprietary and cannot be read by the Kinect 3D video capture project software, meaning that each Kinect camera has to be calibrated internally before it can be used. In practice, the differences are small, and a Kinect camera can be used without internal calibration by assigning default calibration values, but it is strongly recommended to perform calibration on each device individually.

The internal calibration procedure requires a semi-transparent calibration target; precisely, a checkerboard with alternating clear and opaque tiles. Such a target can be constructed by gluing a large sheet of paper to a clear glass plate, drawing or ideally printing a checkerboard onto it, and cutting out all "odd" tiles using large rulers and a sharp knife. It is important that the tiles are lined up precisely and have precise sizes, and that the clear tiles are completely clean without any dust, specks, or fingerprints. Calibration targets can have a range of sizes and numbers of tiles, but we found the ideal target to contain 7x5 tiles of 3.5"x3.5" each.

Given an appropriate calibration target, the calibration process is performed using RawKinectViewer and its "Draw Grids" tool. The procedure is to show the calibration target to the Kinect camera from a variety of angles and distances, and to capture a calibration tie point for each viewpoint by fitting a grid to the target's images in the depth and color streams interactively.

The detailed procedure is:

1. Aim the Kinect camera at the calibration target from a certain position and angle. It is important to include several views where the calibration target is seen at an angle.

2. Capture an average depth frame by selecting the "Average Frames" main menu item, and wait until a static depth frame is displayed.

3. Drag the virtual grids displayed in the depth and color frames using the "Draw Grid" tool's first button until the virtual grids exactly match the calibration target. Matching the target in the depth frame is relatively tricky due to the inherent fuzziness of the Kinect's depth camera. Doing this properly will probably take some practice. The important idea is to get a "best fit" between the calibration target and the grid.
   For the particular purpose of the Augmented Reality Sandbox, the color frame grids can be completely ignored because only the depth camera is used; however, since calibration files are shared between all uses of the Kinect 3D video capture project, it is best to perform a full depth and color calibration.

4. Press the "Draw Grid" tool's second button to store the just-created calibration tie point.

5. Deselect the "Average Frames" main menu entry, and repeat from step 1 until a sufficient number of calibration tie points have been captured. The set of all tie points already selected can be displayed by pressing the "Draw Grid" tool's third button.

After all tie points have been collected:

6. Press the "Draw Grid" tool's fourth button to calculate the Kinect camera's internal calibration parameters. These will be written to an intrinsic parameter file in the Kinect 3D video capture project's configuration directory.

This calibration step is illustrated in the following tutorial video:
http://www.youtube.com/watch?v=Qo05LVxdlfo

Step 3: Mount the Kinect camera above the sandbox
-------------------------------------------------

In theory, the Kinect camera can be aimed at the sand surface from any position and/or angle, but for best results, we recommend positioning the camera such that it looks straight down onto the surface, and such that the depth camera's field-of-view exactly matches the extents of the sandbox. RawKinectViewer can be used to get real-time visual feedback while aligning the Kinect camera.

Step 4: Measure the base plane equation of the sand surface
-----------------------------------------------------------

Because the Kinect camera can be aimed at the sand surface arbitrarily, the Augmented Reality Sandbox needs to know the equation of the "base plane" corresponding to the average flattened sand surface, and the "up direction" defining elevation above or below that base plane.

The base plane can be measured using RawKinectViewer and the "Extract Planes" tool. Flatten and average the sand surface such that it is exactly horizontal, or place a flat board above the sand surface. Then capture an average depth frame by selecting the "Average Frames" main menu entry, and wait until the depth image stabilizes. Now use the "Extract Planes" tool to draw a rectangle in the depth frame that *only* contains the flattened sand surface. After releasing the "Extract Planes" tool's button, the tool will calculate the equation of the plane best fitting the selected depth pixels, and print two versions of that plane equation to the terminal: the equation in depth image space, and the equation in camera space. Of these, only the second is important.

The tool prints the camera-space plane equation in the form

   x * (normal_x, normal_y, normal_z) = offset

This equation has to be entered into the sandbox layout file, which is by default called BoxLayout.txt and contained in the Augmented Reality Sandbox's configuration directory. The format of this file is simple: the first line contains the sandbox's base plane equation in the form

   (normal_x, normal_y, normal_z), offset

The plane equation printed by the "Extract Planes" tool only needs to be modified slightly when pasting it into the sandbox layout file: the "x *" part has to be removed, and the equal sign has to be replaced by a comma. The other four lines in the sandbox layout file are filled in during the next calibration step.
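As a purely illustrative example (the numeric values below are made up; use the values printed for your own setup), an "Extract Planes" camera-space output of

   x * (0.0047, 0.0086, 0.9999) = -98.7

would be entered as the following first line of BoxLayout.txt:

   (0.0047, 0.0086, 0.9999), -98.7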
The base plane equation defines the zero elevation level of the sand surface. Since standard color maps equate zero elevation with sea level, and since, for practical reasons, the base plane is often measured above the flattened average sand surface, it might be desirable to lower the zero elevation level. This can be done easily by editing the sandbox layout file. The zero elevation level can be shifted upwards by increasing the offset value (the fourth component) of the plane equation, and can be shifted downwards by decreasing the offset value. The offset value is measured in cm; therefore, adding 10 to the original offset value will move sea level 10 cm upwards.

This calibration step is illustrated in the following tutorial video:
http://www.youtube.com/watch?v=9Lt4J_BErs0

Step 5: Measure the extents of the sand surface
-----------------------------------------------

The Augmented Reality Sandbox needs to know the lateral extents of the visible sand surface with respect to the base plane. These are defined by measuring the 3D positions of the four corners of the flattened average sand surface using RawKinectViewer and a 3D measurement tool, and then entering those positions into the sandbox layout file.

Start RawKinectViewer, and create a 3D measurement tool by assigning a "Measure 3D Positions" tool to some button. Then measure the 3D positions of the four corners of the flattened sand surface in the order lower left, lower right, upper left, upper right; in other words, form a mirrored Z starting in the lower left.

To measure a 3D position, press and release the button to which the measurement tool was bound inside the depth frame. This will query the current depth value at the selected position, project it into 3D camera space, and print the resulting 3D position to the console. Simply paste the four corner positions, in the order mentioned above, into the sandbox layout file.
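Putting steps 4 and 5 together, a complete sandbox layout file might look like the following sketch. All numbers are again made up for illustration and have to be replaced by the values measured for the actual setup; the first line is the base plane equation, and the remaining four lines are the corner positions in the order lower left, lower right, upper left, upper right (the exact formatting of each corner line should follow what the measurement tool prints to the console):

   (0.0047, 0.0086, 0.9999), -98.7
   (-49.9, -37.3, -99.2)
   (50.1, -37.6, -98.9)
   (-50.0, 37.4, -98.5)
   (50.2, 37.2, -98.3)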
Step 6: Mount the projector above the sandbox
---------------------------------------------

Just like with the Kinect camera, the Augmented Reality Sandbox is capable of dealing with arbitrary projector alignments. As long as there is some overlap between the Kinect camera's field-of-view and the projector's projection area, the two can be calibrated with respect to each other. However, for several reasons, it is best to align the projector carefully such that it projects perpendicularly onto the flattened average sand surface. The main reason is pixel distortion: if the projection is wildly off-axis, the size of projected pixels will change, sometimes drastically, along the sand surface. While the Augmented Reality Sandbox can account for overall geometric distortion, it cannot change the size of displayed pixels, and the projected image looks best if all pixels are approximately square and the same size.

Some projectors, especially short-throw projectors, have off-axis projections, meaning that the image is not centered on a line coming straight out of the projection lens. In such cases, perpendicular projection does not imply that the projector is laterally centered above the sandbox; in fact, it will have to be mounted off to one side. The criterion to judge perpendicular projection is that the projected image appears as a rectangle, not a trapezoid.

We strongly recommend against using any built-in keystone correction a particular projector model might provide. The Augmented Reality Sandbox corrects for keystoning internally, and projector-based keystone correction works on an already pixelated image, meaning that it severely degrades image quality. Never use keystone correction. Align the projector as perpendicularly as possible, and let the Augmented Reality Sandbox handle the rest.

The second reason to aim for perpendicular projection is focus. Projector images are focused in a plane perpendicular to the projection direction, meaning that only a single line of the projected image will be in correct focus when a non-perpendicular projection is chosen. Either way, after the projector has been mounted, we recommend focusing it such that the entirety of the flattened average sand surface is as much in focus as possible.

On a tangential note, we also strongly recommend running projectors only at their native pixel resolutions. Most projector models will support a wide range of input video formats to accommodate multiple uses, but using any resolution besides the one corresponding to the projector's image generator is a very bad idea because the projector will have to rescale the input pixel grid to its native pixel grid, which causes severe degradation in image quality. Some projectors "lie" about their capabilities to seem more advanced, resulting in a suboptimal resolution when using plug&play or automatic setups. It is always a good idea to check the projector's specification for its native resolution, and to ensure that the graphics card uses that resolution when the projector is connected.

Step 7: Calculate the projector calibration matrix
--------------------------------------------------

The most important step to create a true augmented reality display is to calibrate the Kinect camera capturing the sand surface and the projector projecting onto it with respect to each other, so that the projected colors and topographic contour lines appear exactly in the right place. Without this calibration, the Augmented Reality Sandbox is just a sandbox with some projection.

This calibration step is performed using the CalibrateProjector utility and a custom calibration target. This target has to be a flat circular disk whose exact center point is marked in some fashion. We recommend using an old CD, gluing a white paper disk of the proper size to one side, and drawing two orthogonal lines through the CD's center point onto the paper disk. It is important that the two lines intersect in the exact center of the disk.

The calibration procedure is to place the disk target into the Kinect camera's field-of-view in a sequence of prescribed positions, guided by the projector. When CalibrateProjector is started, it will first capture a background image of the current sand surface; it is important that the surface is not disturbed during or after this capture step, and that no other objects are between the Kinect camera and the sand surface. Afterwards, CalibrateProjector will collect a sequence of 3D tie points. For each tie point, it will display two intersecting lines. The user has to position the disk target such that the projected lines exactly intersect in the disk's center point, and such that the disk surface is parallel to the flattened average sand surface, i.e., the base plane that was measured in a previous calibration step. It is important to place the disk at a variety of elevations above and ideally below the base surface to collect a full 3D calibration matrix. If all tie points are in the same plane, the calibration procedure will fail.

The exact procedure is as follows:
1. Start CalibrateProjector and wait for it to collect a background frame. Background capture is active while the screen is red.
   It is essential to run CalibrateProjector in full-screen mode on the projector, or the resulting calibration will be defective. See the Vrui user's manual on how to force Vrui applications to run at the proper position and size. Alternatively, switch CalibrateProjector into full-screen mode manually by pressing the Win+f key combination.
   When started, CalibrateProjector must be told the exact pixel size of the projector's image using the -s <width> <height> command line option (see the example command after this procedure). Using a wrong pixel size will result in a defective calibration. The recommended BenQ short-throw projector has 1024x768 pixels, which is also the default in the software. In other words, when using an XGA-resolution projector, the -s option is not required.

2. Create a "Capture" tool and bind it to two keys (here "1" and "2"). Press and hold "1" and move the mouse to highlight the "Capture" item in the tool selection menu that pops up. Then release "1" to select the highlighted item. This will open a dialog box prompting to press a second key; press and release "2". This will close the dialog box. Do not press "1" again while the dialog box is still open; that will cancel the tool creation process.
   This process binds functions to two keys: "1" will capture a tie point, and "2" will re-capture the background sand surface. "2" should only be pressed if the sand surface changes during the calibration procedure, for example if a hole is dug to capture a lower tie point. After any change to the sand surface, remove the calibration object and any other objects, press "2", and wait for the screen to turn black again.

3. Place the disk target at some random elevation above or below the flattened average sand surface such that the intersection of the projected white lines exactly coincides with the target's center point.

4. Remove your hands from the disk target and confirm that the target is seen by the Kinect camera. CalibrateProjector will display all non-background objects as yellow blobs, and the object it identified as the calibration target as a green blob. Because there is no calibration yet, the green blob corresponding to the disk target will not be aligned with the target; simply ensure that there is a green blob, that it is circular and stable, and that it matches the actual calibration target (put your hand next to it, and see if the yellow blob matching your hand appears next to the green blob).

5. Press the "Capture" tool's first button ("1"), and wait until the tie point is captured. Do not move the calibration target or hold any objects above the sand surface while a tie point is captured.

6. CalibrateProjector will move on to the next tie point position, and display a new set of white lines. Repeat from step 3 until all tie points have been captured. Once the full set has been collected, CalibrateProjector will calculate the resulting calibration matrix, print some status information, and write the matrix to a file inside the Augmented Reality Sandbox's configuration directory. The user can continue to capture more tie points to improve calibration as desired; the calibration file will be updated after every additional tie point. Simply close the application window when satisfied.
   Additionally, after the first round of tie points has been collected, CalibrateProjector will track the calibration target in real-time and indicate its position with red crosshairs. To check calibration quality, place the target anywhere in or above the sandbox, remove your hands, and ensure that the red crosshairs intersect in the target's center.
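For example, for a hypothetical projector with a native resolution of 1920x1080 pixels, CalibrateProjector would be started from the Augmented Reality Sandbox's base directory as follows (substitute the actual native pixel size of the projector being used, and remember to switch the application to full-screen mode on the projector as described in step 1):

   > ./bin/CalibrateProjector -s 1920 1080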
This calibration step is illustrated in the following tutorial video:
http://www.youtube.com/watch?v=V_-Qn7oEsn4

This older video:
http://www.youtube.com/watch?v=vXkA9gUoSAc
shows the previous calibration procedure and no longer applies.

Step 8: Run the Augmented Reality Sandbox
-----------------------------------------

At this point, calibration is complete. It is now possible to run the main Augmented Reality Sandbox application from inside the source code directory (or the optionally chosen installation directory):

> ./bin/SARndbox -fpv

The -fpv option tells the AR Sandbox to use the projector calibration matrix created in step 7.

It is very important to run the application in full-screen mode on the projector, or at least with the exact same window position and size as CalibrateProjector in step 7. If this is not done correctly, the calibration will not work as desired. To manually switch SARndbox into full-screen mode after start-up, press the Win+f key combination.

To check the calibration, observe how the projected colors and topographic contour lines exactly match the physical features of the sand surface. If there are discrepancies between the two, repeat calibration step 7 until satisfied. On a typical 40"x30" sandbox, where the Kinect is mounted approximately 38" above the sandbox's center point, and using a perpendicularly projecting 1024x768 projector, alignment between the real sand surface and the projected features should be on the order of 1 mm.

SARndbox provides a plethora of configuration files and command line options to fine-tune the operation of the Augmented Reality Sandbox as desired. Run SARndbox -h to see the full list of options and their default values, or refer to external documentation on the project's web site.

Note on water simulation
------------------------

Without the real-time water simulation, the Augmented Reality Sandbox has very reasonable hardware requirements. Any current PC with any current graphics card should be able to run it smoothly. The water simulation, on the other hand, places extreme load even on high-end current hardware. We therefore recommend turning off the water simulation (using the -ws 0.0 0 command line option to SARndbox) or reducing its resolution (using the -wts <width> <height> command line option with small sizes, e.g., -wts 200 150) for initial testing, or unless the PC running the Augmented Reality Sandbox has a top-of-the-line CPU, a high-end gaming graphics card, e.g., an Nvidia GeForce 970, and the vendor-supplied proprietary drivers for that graphics card.
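For example, the following two invocations (using only options already described in this README, combined on one command line) run the sandbox with the projector calibration from step 7 and either a reduced-resolution water simulation for initial testing, or no water simulation at all; run SARndbox -h for the authoritative list of options and their defaults:

   > ./bin/SARndbox -fpv -wts 200 150
   > ./bin/SARndbox -fpv -ws 0.0 0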