Bounds Unity



BoundsControl is the new component for manipulation behavior, previously found in BoundingBox. Bounds control makes a number of improvements and simplifications in setup and adds new features. This component is a replacement for the bounding box, which will be deprecated.


The BoundsControl.cs script provides basic functionality for transforming objects in mixed reality. A bounds control will show a box around the hologram to indicate that it can be interacted with. Handles on the corners and edges of the box allow scaling, rotating or translating the object. The bounds control also reacts to user input. On HoloLens 2, for example, the bounds control responds to finger proximity, providing visual feedback to help perceive the distance from the object. All interactions and visuals can be easily customized.

Normally in Unity, when you want the bounds of a visible object, renderer.bounds returns the bounds in world space and mesh.bounds returns the bounds in local space. This is a Unity basic that you use all the time.
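
As a quick illustration, here is a minimal sketch of reading both kinds of bounds from a visible object (assuming the object has a Renderer and a MeshFilter):

```csharp
using UnityEngine;

public class BoundsReader : MonoBehaviour
{
    void Start()
    {
        // World-space bounds of the rendered object.
        Bounds worldBounds = GetComponent<Renderer>().bounds;

        // Local-space bounds of the mesh itself.
        Bounds localBounds = GetComponent<MeshFilter>().sharedMesh.bounds;

        Debug.Log($"World: {worldBounds.center}/{worldBounds.size}, Local: {localBounds.center}/{localBounds.size}");
    }
}
```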


Example scene

You can find examples of bounds control configurations in the BoundsControlExamples scene.

Inspector properties

Target object


This property specifies which object will get transformed by the bounds control manipulation. If no object is set, it defaults to the owner object.

Activation behavior

There are several options to activate the bounds control interface.

  • Activate On Start: Bounds control becomes visible once the scene is started.
  • Activate By Proximity: Bounds control becomes visible when an articulated hand is close to the object.
  • Activate By Pointer: Bounds control becomes visible when it is targeted by a hand-ray pointer.
  • Activate By Proximity and Pointer: Bounds control becomes visible when it is targeted by a hand-ray pointer or an articulated hand is close to the object.
  • Activate Manually: Bounds control does not become visible automatically. You can manually activate it through a script by accessing the boundsControl.Active property.
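
For the Activate Manually option, here is a minimal sketch of turning the control on from a script; the MRTK 2.x namespace is an assumption, and the Active property is the one mentioned above:

```csharp
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

public class ActivateBoundsControl : MonoBehaviour
{
    void Start()
    {
        // Manually activate the bounds control when the scene starts.
        var boundsControl = GetComponent<BoundsControl>();
        boundsControl.Active = true;
    }
}
```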

Bounds override

Sets a box collider from the object for bounds computation.

Box padding

Adds padding to the collider bounds used to calculate the extents of the control. This influences not only the interaction but also the visuals.

Flatten axis

Indicates whether the control is flattened in one of the axes, making it two-dimensional and disallowing manipulation along that axis. This feature can be used for thin objects like slates. If the flatten axis is set to Flatten Auto, the script automatically picks the axis with the smallest extent as the flatten axis.

Smoothing

The smoothing section allows configuring the smoothing behavior for scaling and rotating the control.

Visuals

The appearance of bounds control can be configured by modifying one of the corresponding visuals configurations. Visual configurations are either linked or inlined scriptable objects and are described in more detail in the configuration objects section.

Configuration Objects

The control comes with a set of configuration objects that can be stored as scriptable objects and shared between different instances or prefabs. Configurations can be shared and linked either as individual scriptable asset files or nested scriptable assets inside of prefabs. Further configurations can also be defined directly on the instance without linking to an external or nested scriptable asset.


The bounds control inspector will indicate whether a configuration is shared or inlined as part of the current instance by showing a message in the property inspector. In addition, shared instances won't be editable directly in the bounds control property window itself; instead, the asset they link to has to be modified directly to avoid any accidental changes to shared configurations.

Currently bounds control offers configuration object options for the following features:

  • Box display
  • Scale handles
  • Rotation handles
  • Translation handles
  • Links (wireframe)
  • Proximity effect

Box configuration

The box configuration is responsible for rendering a solid box with bounds defined via collider size and box padding. The following properties can be set up:

  • Box material: defines the material applied to the rendered box when no interaction takes place. A box will only be rendered if this material is set.
  • Box grabbed material: material for the box when the user interacts with the control by grabbing via near or far interaction.
  • Flatten axis display scale: a scale that is applied to the box display if one of the axes is flattened.

Scale handles configuration

This property drawer allows modifying the behavior and visualization of the scale handles of bounds control.

  • Handle material: material applied to the handles.
  • Handle grabbed material: material applied to the grabbed handle.
  • Handle prefab: optional prefab for the scale handle. If none is set, MRTK uses a cube by default.
  • Handle size: size of the scale handle.
  • Collider padding: padding to add to the handle collider.
  • Draw tether when manipulating: when active will draw a tether line from point of start of interaction to current hand or pointer position.
  • Handles ignore collider: if a collider gets linked here, handles will ignore any collision with this collider.
  • Handle slate prefab: prefab to use for the handle when the control is flattened.
  • Show scale handles: controls visibility of the handle.
  • Scale behavior: can be set to uniform or non-uniform scaling.

Rotation handles configuration

This configuration defines the rotation handle behavior.

  • Handle material: material applied to the handles.
  • Handle grabbed material: material applied to the grabbed handle.
  • Handle prefab: optional prefab for the handle. If none is set, MRTK uses a sphere by default.
  • Handle size: size of the handle.
  • Collider padding: padding to add to the handle collider.
  • Draw tether when manipulating: when active will draw a tether line from point of start of interaction to current hand or pointer position.
  • Handles ignore collider: if a collider gets linked here, handles will ignore any collision with this collider.
  • Handle prefab collider type: collider type to be used with the created handle.
  • Show handle for X: controls visibility of the handle for X axis.
  • Show handle for Y: controls visibility of the handle for Y axis.
  • Show handle for Z: controls visibility of the handle for Z axis.

Translation handles configuration

Allows enabling and configuring translation handles for bounds control. Note that translation handles are disabled by default.

  • Handle material: material applied to the handles.
  • Handle grabbed material: material applied to the grabbed handle.
  • Handle prefab: optional prefab for the handle. If none is set, MRTK uses a sphere by default.
  • Handle size: size of the handle.
  • Collider padding: padding to add to the handle collider.
  • Draw tether when manipulating: when active will draw a tether line from point of start of interaction to current hand or pointer position.
  • Handles ignore collider: if a collider gets linked here, handles will ignore any collision with this collider.
  • Handle prefab collider type: collider type to be used with the created handle.
  • Show handle for X: controls visibility of the handle for X axis.
  • Show handle for Y: controls visibility of the handle for Y axis.
  • Show handle for Z: controls visibility of the handle for Z axis.

Links configuration (wireframe)

The links configuration enables the wireframe feature of bounds control. The following properties can be configured:

  • Wireframe material: the material applied to the wireframe mesh.
  • Wireframe edge radius: the thickness of the wireframe.
  • Wireframe shape: the shape of the wireframe can be either cubic or cylindrical.
  • Show wireframe: controls visibility of the wireframe.

Proximity effect configuration

Show and hide the handles with animation based on the distance to the hands. It has a two-step scaling animation. Defaults are set to HoloLens 2 style behavior.

  • Proximity Effect Active: Enable proximity-based handle activation
  • Object Medium Proximity: Distance for the 1st step scaling
  • Object Close Proximity: Distance for the 2nd step scaling
  • Far Scale: Default scale value of the handle asset when the hands are out of range of the bounds control interaction (distance defined above by 'Object Medium Proximity'. Use 0 to hide the handle by default)
  • Medium Scale: Scale value of the handle asset when the hands are within range of the bounds control interaction (distance defined above by 'Object Close Proximity'. Use 1 to show normal size)
  • Close Scale: Scale value of the handle asset when the hands are within range of the grab interaction (distance defined above by 'Object Close Proximity'. Use 1.x to show a bigger size)
  • Far Grow Rate: Rate a proximity scaled object scales when the hand moves from medium to far proximity.
  • Medium Grow Rate: Rate a proximity scaled object scales when the hand moves from medium to close proximity.
  • Close Grow Rate: Rate a proximity scaled object scales when the hand moves from close proximity to object center.

Constraint System

Bounds control supports using the constraint manager to limit or modify translation, rotation or scaling behavior while using bounds control handles.

The property inspector will show all available constraint managers attached to the same game object in a dropdown with an option to scroll and highlight the selected constraint manager.

Events

Bounds control provides the following events. The example scene uses these events to play audio feedback; a minimal subscription sketch follows the list.

  • Rotate Started: Fired when rotation starts.
  • Rotate Stopped: Fired when rotation stops.
  • Scale Started: Fired when scaling starts.
  • Scale Stopped: Fired when scaling stops.
  • Translate Started: Fired when translation starts.
  • Translate Stopped: Fired when translation stops.
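
As a hedged sketch of the audio-feedback idea, the following script subscribes to the events listed above; the MRTK 2.x namespace and the use of an AudioSource are assumptions:

```csharp
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class BoundsControlAudioFeedback : MonoBehaviour
{
    void Start()
    {
        var boundsControl = GetComponent<BoundsControl>();
        var audioSource = GetComponent<AudioSource>();

        // Play a sound whenever a handle interaction starts or stops.
        boundsControl.RotateStarted.AddListener(() => audioSource.Play());
        boundsControl.RotateStopped.AddListener(() => audioSource.Play());
        boundsControl.ScaleStarted.AddListener(() => audioSource.Play());
        boundsControl.ScaleStopped.AddListener(() => audioSource.Play());
        boundsControl.TranslateStarted.AddListener(() => audioSource.Play());
        boundsControl.TranslateStopped.AddListener(() => audioSource.Play());
    }
}
```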

Elastics (Experimental)

Elastics can be used when manipulating objects via bounds control. Note that the elastics system is still in experimental state. To enable elastics either link an existing elastics manager component or create and link a new elastics manager via the Add Elastics Manager button.

Handle styles

By default, when you just assign the BoundsControl.cs script, it will show handles in the HoloLens 1st gen style. To use HoloLens 2 style handles, you need to assign the proper handle prefabs and materials.

Below are the prefabs, materials, and the scaling values for the HoloLens 2 style bounds control handles. You can find this example in the BoundsControlExamples scene.

Handles (Setup for HoloLens 2 style)

  • Handle Material: BoundingBoxHandleWhite.mat
  • Handle Grabbed Material: BoundingBoxHandleBlueGrabbed.mat
  • Scale Handle Prefab: MRTK_BoundingBox_ScaleHandle.prefab
  • Scale Handle Slate Prefab: MRTK_BoundingBox_ScaleHandle_Slate.prefab
  • Scale Handle Size: 0.016 (1.6cm)
  • Scale Handle Collider Padding: 0.016 (makes the grabbable collider slightly bigger than handle visual)
  • Rotation Handle Prefab: MRTK_BoundingBox_RotateHandle.prefab
  • Rotation Handle Size: 0.016
  • Rotation Handle Collider Padding: 0.016 (makes the grabbable collider slightly bigger than handle visual)

Transformation changes with object manipulator

A bounds control can be used in combination with ObjectManipulator.cs to allow for certain types of manipulation (e.g. moving the object) without using handles. ObjectManipulator supports both one-handed and two-handed interactions. Hand tracking can be used to interact with an object up close.

In order for the bounds control edges to behave the same way when moving the object using ObjectManipulator's far interaction, it is advised to connect its On Manipulation Started / On Manipulation Ended events to BoundsControl.HighlightWires / BoundsControl.UnhighlightWires respectively.
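
A hedged sketch of making the same connection from code instead of the inspector, assuming the MRTK 2.x namespaces and the event and method names mentioned in this section:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

public class BoundsControlWithManipulator : MonoBehaviour
{
    void Start()
    {
        var boundsControl = GetComponent<BoundsControl>();
        var objectManipulator = GetComponent<ObjectManipulator>();

        // Highlight the bounds control wires while the object is being manipulated.
        objectManipulator.OnManipulationStarted.AddListener(_ => boundsControl.HighlightWires());
        objectManipulator.OnManipulationEnded.AddListener(_ => boundsControl.UnhighlightWires());
    }
}
```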

How to add and configure a bounds control using Unity Inspector

  1. Add Box Collider to an object
  2. Assign BoundsControl script to an object
  3. Configure options, such as the 'Activation' method (see the Inspector properties section above)
  4. (Optional) Assign prefabs and materials for a HoloLens 2 style bounds control (see the Handle styles section above)
Note

Use the Target Object and Bounds Override fields in the inspector to assign a specific object and collider when the object has multiple child components.

How to add and configure a bounds control in the code

  1. Instantiate cube GameObject

  2. Assign BoundsControl script to an object with collider, using AddComponent<>()

  3. Configure options either directly on the control or via one of the scriptable configurations (see the Inspector properties and Configuration Objects sections above)

  4. (Optional) Assign prefabs and materials for a HoloLens 2 style bounds control. These assignments still need to be made through the inspector, since the materials and prefabs would otherwise have to be loaded dynamically.

Note

Using Unity's 'Resources' folder or Shader.Find for dynamically loading shaders is not recommended since shader permutations may be missing at runtime.
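
A minimal sketch of steps 1-3, assuming the MRTK 2.x namespace for BoundsControl; the configuration shown (manual activation via the Active property) is just one example:

```csharp
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

public class BoundsControlSetup : MonoBehaviour
{
    void Start()
    {
        // 1. Instantiate a cube GameObject (a primitive cube already has a BoxCollider).
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0f, 2f);
        cube.transform.localScale = Vector3.one * 0.2f;

        // 2. Assign the BoundsControl script to the object with the collider.
        var boundsControl = cube.AddComponent<BoundsControl>();

        // 3. Configure options directly on the control
        //    (here: activate the control immediately via the Active property).
        boundsControl.Active = true;
    }
}
```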

Example: Set minimum, maximum bounds control scale using MinMaxScaleConstraint

To set the minimum and maximum scale, attach a MinMaxScaleConstraint to your control. Because bounds control automatically attaches and activates a constraint manager, the MinMaxScaleConstraint is applied to transformation changes as soon as it is attached and configured.

You can also use MinMaxScaleConstraint to set minimum and maximum scale for ObjectManipulator.
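
A hedged sketch of this setup from code, assuming the MRTK 2.x namespaces and the MinMaxScaleConstraint property names ScaleMinimum and ScaleMaximum:

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

public class ScaleLimitedBoundsControl : MonoBehaviour
{
    void Start()
    {
        gameObject.AddComponent<BoundsControl>();

        // Limit the scale range; bounds control picks the constraint up
        // through its automatically attached constraint manager.
        var scaleConstraint = gameObject.AddComponent<MinMaxScaleConstraint>();
        scaleConstraint.ScaleMinimum = 0.5f;
        scaleConstraint.ScaleMaximum = 2.0f;
    }
}
```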

Example: Add bounds control around a game object

To add a bounds control around an object, simply add a BoundsControl component to it:
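
For example, a minimal sketch (the helper class and method name are hypothetical; only the AddComponent call matters):

```csharp
using Microsoft.MixedReality.Toolkit.UI.BoundsControl;
using UnityEngine;

public static class BoundsControlHelper
{
    // Put a bounds control around any GameObject that has a collider.
    public static void PutABoundsControlAroundIt(GameObject target)
    {
        target.AddComponent<BoundsControl>();
    }
}
```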

Migrating from Bounding Box

Existing prefabs and instances using bounding box can be upgraded to the new bounds control via the migration window which is part of the MRTK tools package.

For upgrading individual instances of bounding box, there's also a migration option inside the property inspector of the component.


In this tutorial, you learn how to:

  • Add visual and manipulation bounds around remotely rendered models
  • Move, rotate, and scale
  • Raycast with spatial queries
  • Add simple animations for remotely rendered objects

Prerequisites

  • This tutorial builds on Tutorial: Interfaces and custom models.

Query remote object bounds and apply to local bounds

To interact with remote objects, we first need a local representation to interact with. The object's bounds are useful for quick manipulation of a remote object. The remote bounds can be queried from ARR, using the local Entity as a reference. The bounds are queried after the model has been loaded into the remote session.

The bounds of a model are defined by the box that contains the entire model - just like Unity's BoxCollider, which has a center and size defined for the x, y, z axes. In fact, we'll use Unity's BoxCollider to represent the bounds of the remote model.

  1. Create a new script in the same directory as RemoteRenderedModel and name it RemoteBounds.

  2. Replace the contents of the script with the following code:

    Note

    If you see an error in Visual Studio claiming "Feature 'X' is not available in C# 6. Please use language version 7.0 or greater", this error can be safely ignored. It is related to Unity's solution and project generation.

    This script should be added to the same GameObject as the script that implements BaseRemoteRenderedModel. In this case, that means RemoteRenderedModel. Similar to previous scripts, this initial code will handle all the state changes, events, and data related to remote bounds.

    There is only one method left to implement: QueryBounds. QueryBounds fetches the bounds asynchronously, takes the result of the query and applies it to the local BoxCollider.

    The QueryBounds method is straightforward: send a query to the remote rendering session and await the result.

  3. Replace the QueryBounds method with the following completed method:

    We'll check the query result to see if it was successful. If yes, convert and apply the returned bounds in a format that the BoxCollider can accept.

Now, when the RemoteBounds script is added to the same game object as the RemoteRenderedModel, a BoxCollider will be added if needed and when the model reaches its Loaded state, the bounds will automatically be queried and applied to the BoxCollider.
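
A hedged sketch of the apply step, assuming the query result has already been converted into a UnityEngine.Bounds in the model's local space (the tutorial's full RemoteBounds script also handles the query and the state changes, which are not shown here):

```csharp
using UnityEngine;

public class RemoteBoundsApplier : MonoBehaviour
{
    // Apply bounds (already converted to Unity's local space) to a local BoxCollider.
    public void ApplyBounds(Bounds localBounds)
    {
        // Add the BoxCollider if it doesn't exist yet.
        var boxCollider = GetComponent<BoxCollider>();
        if (boxCollider == null)
        {
            boxCollider = gameObject.AddComponent<BoxCollider>();
        }

        // The BoxCollider accepts the same center/size representation.
        boxCollider.center = localBounds.center;
        boxCollider.size = localBounds.size;
    }
}
```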

  1. Using the TestModel GameObject created previously, add the RemoteBounds component.

  2. Confirm the script is added.

  3. Run the application again. Shortly after the model loads, you'll see the bounds for the remote object reflected on the BoxCollider.

Now we have a local BoxCollider configured with accurate bounds on the Unity object. The bounds allow for visualization and interaction using the same strategies we'd use for a locally rendered object. For example, scripts that alter the Transform, physics, and more.

Move, rotate, and scale

Moving, rotating, and scaling remotely rendered objects works the same as any other Unity object. The RemoteRenderingCoordinator, in its LateUpdate method, is calling Update on the currently active session. Part of what Update does is sync local model entity transforms with their remote counterparts. To move, rotate, or scale a remotely rendered model, you only need to move, rotate, or scale the transform of the GameObject representing the remote model. Here, we're going to modify the transform of the parent GameObject that has the RemoteRenderedModel script attached to it.
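
For example, a minimal sketch that rotates the remote model purely by changing the local transform (the testModel field is a hypothetical reference to the GameObject that has the RemoteRenderedModel script):

```csharp
using UnityEngine;

public class SpinRemoteModel : MonoBehaviour
{
    // Assumed reference to the GameObject holding the RemoteRenderedModel script.
    public GameObject testModel;

    void Update()
    {
        // Rotating the local transform is enough; the coordinator syncs it remotely.
        testModel.transform.Rotate(0f, 10f * Time.deltaTime, 0f);
    }
}
```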

This tutorial is using MRTK for object interaction. Most of the MRTK specific implementation for moving, rotating and scaling an object is outside the scope of this tutorial. There is a model view controller that comes pre-configured inside the AppMenu, in the Model Tools menu.

  1. Ensure the TestModel GameObject created previously is in the scene.
  2. Ensure the AppMenu prefab is in the scene.
  3. Press Unity's Play button to play the scene and open the Model Tools menu inside the AppMenu.

The AppMenu has a sub menu Model Tools that implements a view controller for binding with the model. When the GameObject contains a RemoteBounds component, the view controller adds a BoundingBox component, an MRTK component that renders a bounding box around an object with a BoxCollider, and an ObjectManipulator component, which is responsible for hand interactions. Combined, these scripts allow us to move, rotate, and scale the remotely rendered model.

  1. Move your mouse to the Game panel and click inside it to give it focus.

  2. Using MRTK's hand simulation, press and hold the left Shift key.

  3. Steer the simulated hand so the hand ray is pointing to the test model.

  4. Hold left click and drag the model to move it.


You should see the remotely rendered content move along with the bounding box. You might notice some delay or lag between the bounding box and the remote content. This delay will depend on your internet latency and bandwidth.

Ray cast and spatial queries of remote models

A box collider around models is suitable for interacting with the entire model, but not detailed enough to interact with individual parts of a model. To solve this, we can use remote ray casting. Remote ray casting is an API provided by Azure Remote Rendering to cast rays into the remote scene and return hit results locally. This technique can be used for selecting child entities of a large model or getting hit result information like position, surface normal, and distance.

The test model has a number of sub-entities that can be queried and selected. For now, the selection will output the name of the selected Entity to the Unity Console. Check the Materials, lighting and effects chapter for highlighting the selected Entity.

First, let's create a static wrapper around the remote ray cast queries. This script will accept a position and direction in Unity space, convert it to the data types accepted by the remote ray cast, and return the results. The script will make use of the RayCastQueryAsync API.

  1. Create a new script called RemoteRayCaster and replace its contents with the following code:

    Note

    Unity has a class named RaycastHit, and Azure Remote Rendering has a class named RayCastHit. The uppercase C is an important difference to avoid compile errors.

    RemoteRayCaster provides a common access point for casting remote rays into the current session. To be more specific, we'll implement an MRTK pointer handler next. The script will implement the IMixedRealityPointerHandler interface, which will tell MRTK that we want this script to listen for Mixed Reality Pointer events.

  2. Create a new script called RemoteRayCastPointerHandler and replace the code with the following code:

RemoteRayCastPointerHandler's OnPointerClicked method is called by MRTK when a Pointer 'clicks' on a collider, like our box collider. After that, PointerDataToRemoteRayCast is called to convert the pointer's result into a point and direction. That point and direction are then used to cast a remote ray in the remote session.

Sending requests for ray casting on click is an efficient strategy for querying remote objects. However, it's not an ideal user experience because the cursor collides with the box collider, not the model itself.

You could also create a new MRTK pointer that casts its rays in the remote session more frequently. Although this is a more complex approach, the user experience would be better. This strategy is outside the scope of this tutorial, but an example of this approach can be seen in the Showcase App, found in the ARR samples repository.

When a ray cast is completed successfully in the RemoteRayCastPointerHandler, the hit Entity is emitted from the OnRemoteEntityClicked Unity event. To respond to that event, we'll create a helper script that accepts the Entity and performs an action on it. Let's start by getting the script to print the name of the Entity to the debug log.
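
A hedged sketch of that helper; the Entity type and its Name property are assumed from the Azure Remote Rendering SDK:

```csharp
using Microsoft.Azure.RemoteRendering;
using UnityEngine;

public class RemoteEntityHelper : MonoBehaviour
{
    // Callback target for OnRemoteEntityClicked: print the clicked entity's name.
    public void EntityToDebugLog(Entity entity)
    {
        Debug.Log(entity.Name);
    }
}
```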

  1. Create a new script named RemoteEntityHelper and replace its contents with the below:

  2. On the TestModel GameObject created previously, add both the RemoteRayCastPointerHandler component and the RemoteEntityHelper component.

  3. Assign the EntityToDebugLog method to the OnRemoteEntityClicked event. When the event's output type and the method's input type match, we can use Unity's dynamic event hookup, which automatically passes the event value into the method.

    1. Create a new callback field
    2. Drag Remote Entity Helper component into the Object field, to reference the parent GameObject
    3. Assign the EntityToDebugLog as the callback
  4. Press play in the Unity Editor to start the scene, connect to a remote session and load the test model.

  5. Using MRTK's hand simulation press and hold the left Shift key.

  6. Steer the simulated hand so the hand ray is pointing to the test model.

  7. Long click to simulate an air-tap, executing the OnPointerClicked event.

  8. Observe the Unity Console for a log message with the name of the selected child entity.

Synchronizing the remote object graph into the Unity hierarchy

Up to this point, we've only seen a single local GameObject representing the entire model. This works well for rendering and manipulating the entire model. However, if we want to apply effects or manipulate specific sub-entities, we'll need to create local GameObjects to represent those entities. First, let's explore the test model manually.

  1. Start the scene and load the test model.
  2. Expand the children of the TestModel GameObject in Unity's hierarchy and select the TestModel_Entity GameObject.
  3. In the Inspector, click the Show Children button.
  4. Continue expanding children in the hierarchy and clicking Show Children until a large list of children is shown.

A list of dozens of entities will now populate the hierarchy. Selecting one of them will show the Transform and RemoteEntitySyncObject components in the Inspector. By default, each entity isn't automatically synced every frame, so local changes to the Transform aren't synced to the server. You can check Sync Every Frame and then move, scale, or rotate the transform in the Scene view. You won't see the rendered model in the Scene view, so watch the Game view to see the model's position and rotation update visually.

The same process can be done programmatically and is the first step in modifying specific remote entities.
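
A hedged sketch of such a method (the next steps add it to the RemoteEntityHelper script); the GetOrCreateGameObject extension, the UnityCreationMode enum, and the RemoteEntitySyncObject.SyncEveryFrame property are assumptions based on the components described above, and the class name here is only a placeholder:

```csharp
using Microsoft.Azure.RemoteRendering;
using Microsoft.Azure.RemoteRendering.Unity;
using UnityEngine;

public class RemoteEntityHelperSync : MonoBehaviour
{
    // Candidate method for the RemoteEntityHelper script: create (or fetch) a local
    // GameObject for the clicked remote entity and keep its transform synced.
    public void MakeSyncedGameObject(Entity entity)
    {
        // Assumed ARR Unity extension that materializes a local GameObject for the entity.
        GameObject localEntity = entity.GetOrCreateGameObject(UnityCreationMode.DoNotCreateUnityComponents);

        // Sync the local transform to the remote entity every frame.
        localEntity.GetComponent<RemoteEntitySyncObject>().SyncEveryFrame = true;
    }
}
```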

  1. Modify the RemoteEntityHelper script to also contain the following method:

  2. Add an additional callback to the RemoteRayCastPointerHandler event OnRemoteEntityClicked, setting it to MakeSyncedGameObject.

  3. Using MRTK's hand simulation press and hold the left Shift key.

  4. Steer the simulated hand so the hand ray is pointing to the test model.

  5. Long click to simulate an air-tap, executing the OnPointerClicked event.

  6. Check and expand the Hierarchy to see a new child object, representing the clicked entity.

  7. After testing, remove the callback for MakeSyncedGameObject, since we'll incorporate this as part of other effects later.

Note


Syncing every frame is only required when you need to sync the transform data. There is some overhead to syncing transforms, so it should be used sparingly.


Creating a local instance and making it automatically sync is the first step in manipulating sub-entities. The same techniques we've used to manipulate the model as a whole can be used on the sub-entities as well. For example, after creating a synced local instance of an entity, you could query its bounds and add manipulation handlers to allow it to be moved around by the user's hand rays.


Next steps


You can now manipulate and interact with your remotely rendered models! In the next tutorial, we'll cover modifying materials, altering the lighting, and applying effects to remotely rendered models.