VR Workbench

360 Image Menu

What is a 360 image?
A 360 (panorama) image is a special type of image that lets the user experience being inside the image in 3D. It simulates standing in the shoes of the photographer who took the picture, looking left, right, up, and down as desired, and sometimes zooming. The user can view the scene on a mobile device, on Facebook, in other desktop viewers, or embed it into a website.

How to create a 360 image
Create a VR camera and place it in a good position at the center of the scene. In Render Output Settings it is preferred to set the output ratio to 2:1, SPP to 3000, and Rendering Mode to Interior Renderer with Light and Specular depths set to 100. Then select Fixed to File from the 360 Image menu. This process is shown in the following tutorial.

Ways to share your 360 images
- On Facebook
- On mobile, using SimLab VR Viewer
- On desktop, using SimLab VR Viewer
- Adding code to embed 360 images in your website

Sharing 360 images on Facebook
For 360 images to be automatically detected on Facebook, the width-to-height ratio should be 2:1; for example, if the width is 2000, the height should be 1000. Another restriction is a maximum image size of 6000 x 3000 pixels.

Sharing 360 images on mobile
Using the free SimLab VR Viewer, with a 360 image created as a *.png:
1- Transfer the image to a mobile device.
2- Download File Manager (Mobile, AsusTek Computer Inc.).
3- Open File Manager and browse to the 360 image.
4- Select the image from the list, and experiment with your 360 images!

For more details and examples on how to embed a 360 image in a website, visit these links: Link_1, Link_2, and refer to the SimLab 360 Technology page.

Multiple 360s
Multiple 360s allow users to view 360 images that have been created and move among them in the VR Viewer.
To use multiple 360s, follow the steps below:
1- Create a VR Camera from the 360 Image Menu and move it to the proper location, then duplicate it to add multiple cameras to the scene (at least two).
2- Select Multiple 360s from the 360 Image Menu to render multiple 360 images.
3- When the renders are finished, the result appears in the Properties dialog of the VR Camera under the 360 section. There the user can preview and change the 360 images, make the camera position appear behind walls (it is Not Visible by default), and move among the 360 images in the VR Viewer.

To view multiple 360s in the VR Viewer, select Show in Viewer from the VR Viewer menu. This opens the Viewer Modes dialog; select Desktop, then from the Start Viewer dialog select 360.

360/Panorama Grid (how-to tutorial):

Teleportation Points
Use teleportation points and VR cameras to navigate large scenes quickly by activating the X-ray view in SimLab VR/Desktop Viewer. This guide shows the user how to walk through and teleport around a scene.

High Dynamic Range Image (HDR) Creation
HDR (High Dynamic Range) images, or environment images, are one of the preferred methods to light a 3D scene for rendering. Originally these images were generated using specialized cameras or specialized software. Due to the processes involved in generating HDRs, they usually carry licenses that prevent free distribution. SimLab Composer supports creating 3D scene HDRs, Studio HDRs, and HDR Studio.

Generating HDRs with SimLab Composer
SimLab Composer makes it easy to generate HDR images, with the freedom to distribute them like any other rendered image created using SimLab Composer. With SimLab Composer, the user can create multiple 360 render images from the VR Viewer faster and more easily, without consuming the device's processor. View the following tutorial to learn how to create HDR images using SimLab Composer.
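Before sharing a render, the Facebook constraints described earlier (an exact 2:1 width-to-height ratio, with a 6000 x 3000 pixel maximum) can be checked up front. A minimal sketch; the function name is our own, not part of SimLab Composer:

```python
def check_facebook_360(width, height):
    """Return a list of problems that would stop Facebook from
    auto-detecting a panorama (an empty list means the size is fine)."""
    problems = []
    if width != 2 * height:               # exact 2:1 ratio required
        problems.append("width-to-height ratio must be 2:1")
    if width > 6000 or height > 3000:     # documented maximum dimensions
        problems.append("exceeds the 6000 x 3000 pixel maximum")
    return problems

print(check_facebook_360(2000, 1000))   # a 2000 x 1000 image passes: []
```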
Sharing HDRs
If you generate a special HDR that you want to share with other SimLab Composer users, please send an email to support@simlab-soft.com and we will be glad to share it with the community.

VR Effects / Media Menu
This menu allows the user to add visual effects and dynamic elements to the VR Experience. It is organized into the following groups: Particle, Landscape, Media, and UI Panels.

Particles Group
Includes the visual effects supported in SimLab Composer/VR Studio:
- Make Fluid
- Create Fire
- Create Smoke
- Create Spark
- Create Shower
- Create Steam
- Create Wire/Rope
- Create Hose
- Create Spring

Make Fluid
Enables users to convert selected 3D shapes into fluids. To learn more about this feature, check the following tutorial:

Create Fire and Smoke
Enables the user to add a fire effect to the scene; this can be a large trailing fire, a stove, or a candle. The fire effect automatically includes a smoke option to be shown with the fire. If the user wants smoke alone, without fire, Create Smoke can be used. The following tutorial shows the process of creating fire and smoke in Composer/VR Studio:

Create Spark
The spark effect can be used to show electrical sparks or a welding effect. The following tutorial shows how to create the spark effect:

Create Shower
Moving water or liquid effects in SimLab Composer or VR Studio can be created using the shower effect. It allows the user to control the color, speed, and strength of the water flow. The following tutorial shows the process of creating the shower effect:

Create Steam
Creates a steam effect in the VR Experience with the VR Properties shown below. The user can change the steam strength and color in the Properties dialog.

Create Wire/Rope
Creates connecting wires in the scene, like connecting the soldering kit shown below with its electric plug. The following tutorial shows the process of creating a wire:

Create Hose
Clicking this tool will create two points in the 3D area, a start point and an end point.
Adjust their locations as desired. For more about this tool, check this tutorial.

Create Spring
Creates a 3D spring model with the length specified by the start and end points selected by the user. A 3D Geom for the spring is added to the Scene Browser with its two endpoints. These points can be selected and transformed, changing the shape/length of the spring. With the created spring selected, its Properties panel appears, where the VR Spring properties Radius and Thickness can be edited.

Landscape Group

Make Water
Make Water works on one object or a group of objects; the process is as follows:
1- From the Effects menu, select Make Water.
2- Select the 3D object(s) to become water, then click.
3- In the Scene Browser, notice that the icon for the 3D object(s) has changed to a water shape.

The water settings can be adjusted from the VR Water tab in the Properties dialog. Water settings are:
- Water Color: the color to use for the water.
- Water Depth, Water Speed, Water Turbidity, Wave Speed, Wave Size, and Wave Roughness: parameters used to make a more realistic water surface.

To remove the water effect, in the Scene Browser's top combo box select VR Landscape. An 'X' mark will appear next to the water geometry; click it to remove the water effect.

Make Ground
Ground objects are objects the user can teleport to in VR. By default the user does not need to set ground objects: any object with good orientation (a normal close to the up vector) is considered ground. If the user wants more control and wants to prevent others from moving all around the scene, Make Ground can be used to define the object(s) the user can teleport to. When the Make Ground tool is used, default ground is turned off, and only objects defined as ground by the user are considered ground.

Push and hold the stick on a VR controller to activate teleportation mode, then release to teleport to a valid destination.
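The default ground test described above ("a normal close to the up vector") can be pictured as an angle check between the surface normal and the up direction. A rough sketch; the 45-degree threshold is an assumption for illustration, not SimLab's documented value:

```python
import math

def is_default_ground(normal, up=(0.0, 0.0, 1.0), max_slope_deg=45.0):
    """True when the surface normal is within max_slope_deg of the up
    vector, i.e. the surface is flat enough to teleport onto."""
    dot = sum(n * u for n, u in zip(normal, up))
    length = math.sqrt(sum(n * n for n in normal)) or 1.0
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / length))))
    return angle <= max_slope_deg

print(is_default_ground((0.0, 0.0, 1.0)))  # flat floor: True
print(is_default_ground((1.0, 0.0, 0.0)))  # vertical wall: False
```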
A green teleportation pointer indicates that teleportation onto an object is possible; a red pointer indicates that it is forbidden.

Grass
Enables users to add a realistic grass effect to VR scenes. Selecting this VR effect opens the VR Grass Generator dialog, which allows the user to control:
- Generator Name
- The geometry the grass should be added to (select the geometry, then click (+))
- Overall coverage of the selected object with grass
- The percentage of each kind of grass to be generated

The following video shows a sample of the grass effect in action. The following tutorial shows how to use grass in Composer or VR Studio:

Media Group

Make Video
With the Make Video effect, the user can convert any 3D object into a video player. The video should be an MP4 file. The user can use a URL link, in which case the video will be streamed during the VR Experience, or a local link to a file on the user's machine. Local videos are automatically added to the VR Experience package, so they can still play when VR Experiences are distributed. The following dialog shows the VR Video properties. To learn how to use the Video effect in VR, check the following tutorial:

Remove Video
To remove a video, click the "X" mark in the Scene Browser next to the 3D geometry that includes the video.

The Video Action response in VR Training Builder can be used to control video playback in VR. It can pause, play, toggle, or seek, as a response to an event. For more info about Video Actions in VR Training Builder, check this tutorial.

Create Sound
This function adds a 3D Sound object to the scene and the Scene Browser. Selecting the 3D Sound in the tree displays its properties in the Properties Panel, VR Sound tab. The 3D sound object occupies a location in the 3D scene. The user can determine whether the sound should play when the scene starts, and whether it plays once or in a loop.
The type of the 3D sound can be either Spatial, meaning the sound volume is automatically affected by the location of the 3D sound in the scene, or Constant, meaning the sound plays at the same volume in all areas of the scene. For more information about VR Sound, check this tutorial.

Create Web Browser
Adds a flat-screen VR Web Browser element to the Scene Browser and the 3D area. In the Properties dialog, under the VR Web Browser tab, the user can input the URL for the browser to display. The width and height of the Web Browser plane can be changed using the white points that appear on it when it is selected. The web browser can be placed on a PC screen, or cover a full wall in the VR Experience.

Create Surveillance Plane
The Surveillance Plane allows the user to view far-away or hard-to-see parts of the scene using a VR Camera. The VR Surveillance plane is linked to the camera, so it shows what the camera is seeing. The following tutorial shows how to use VR Surveillance. Creating a VR Camera is available under the VR Viewer menu.

VR Slides Generator
PowerPoint presentations or PDF files generated in the past can now be utilized easily in VR Experiences. In a few clicks you will be able to get the presentation, and optionally a small podium version of it, for an amazing VR Experience. The VR Slides Generator tool automatically creates the needed Training Builder blocks, making it easy to use slide switching as an event, for example to show/hide a 3D object that works well with your presentation. To learn more about this tool, check this tutorial.

UI Panel Group

Create Variable Writer
This tool enables the user to track the value of a VR Variable in the VR Experience. Click here to learn more about creating VR Variables. The value of the variable is updated dynamically: whenever the variable value changes, the variable writer updates to show the new value.
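The dynamic update behaves like an observer: every change to the variable is pushed to the writer's display. An illustrative Python analogy only, not SimLab's actual API:

```python
class TrackedVariable:
    """Minimal analogy for a VR variable watched by a variable writer:
    every assignment notifies registered writers, so the displayed
    text stays in sync with the variable's value."""
    def __init__(self, value):
        self._value = value
        self._writers = []

    def watch(self, writer):
        self._writers.append(writer)
        writer(self._value)              # show the initial value

    def set(self, value):
        self._value = value
        for writer in self._writers:     # dynamic update on every change
            writer(value)

display = []
score = TrackedVariable(0)
score.watch(lambda v: display.append(f"Score: {v}"))
score.set(5)
print(display[-1])   # the writer now shows "Score: 5"
```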
The user can control the size and placement of the variable writer, so it can be aligned on a wall in the scene or on a monitor. The user selects which variable to view, the color of the text, and the prefix text. You can also change its horizontal and vertical text alignment, check "Max String Length" to keep the font size fixed, and check "Editable" if you want to be able to edit the contained variable directly while running the VR Experience in the Viewer.

Create Scene States Lister
The Scene States Lister automates the process of showing the Scene States in the scene and allows the user to switch between them during the VR Experience. Adding a Scene States Lister adds it to the 3D area and the Scene Browser. The user can control its parameters from the VR Scene State Lister tab in the Properties Panel, as shown in the following image. The following tutorial shows how to use the Scene States Lister in a VR Experience:

Interactions Menu
Allows users to define interactions in VR scenes.

Grabbing Group

Make Grabbable
Makes selected object(s) grabbable in the scene. A grabbable object can be grabbed by a controller in a VR Experience, or by the desktop hand, and moved from one place to another. Grabbable objects are highlighted clearly in the Scene Browser, and in the Desktop or VR viewing modes they are highlighted in orange or red, as shown in the following image:

Make Grabbable Sequence
Grabbable sequences allow users to interact with 3D object(s) in the Viewer by linking them to animation sequences. In the image below, an animation to control the arm was saved as a sequence, then linked to the arm object using Grabbable Sequence. When the arm is grabbed in the Viewer, the path appears and the user can move the grabbed object along it.
The following tutorial shows how to create and use a Grabbable Sequence.

Dynamics Group

Enable Physics
The physics feature gives an object physical properties such as gravity, mass, and friction. To add physics to an object, select Enable Physics, then select the object you want.

Make Climbable
With this feature, selected objects in the scene, such as ladders, can be made climbable. Just click Make Climbable and select the object you want. Check this tutorial for more about these two features.

To adjust the Density and Restitution properties, which affect how objects behave during physics simulations:
1- Choose the material(s) applied to the physics objects.
2- Navigate to the Physics properties.
3- Update the Density and Restitution values according to the instructions illustrated in the accompanying image.

The Training Builder gives the user the option to enable/disable physics on physics objects in the VR Experience. The response for doing that is shown in the following image:

Clipping Planes
Also known as section planes, clipping planes in VR can be static, animated, grabbable, and turned on/off dynamically. By setting the Clipping Plane and the Clippable Objects, the user can create clipping effects for some or all objects in the scene. The clipping plane can be static, or it can be animated, revealing the Clippable Objects as it moves. The Clipping Planes tutorial shows how to use Clipping Planes and assign Clippable Objects. From the VR Clipping Plane tab in the Properties Panel, the user can choose to show or hide the clipping plane. The direction of the clipping plane's normal determines which section of the model is clipped.

Disassembly and Docking Group

Make Dock and Target
The Dock is the static object, and the Target is the (grabbable) moving one, so the Target needs to be made grabbable first. After selecting this function the user will be asked to Select Dock Node, then to Select Target Node, and finally to configure the object orientation.
In the VR Experience, the user grabs the Target, and when it is released (grab ends) while intersecting with the Dock, it is moved to the docking target's predefined position.

Remove Dock and Target
Selecting this function will prompt the user to Select Dock Node and Select Target Node; the relation will then be removed. The following tutorial shows how to create a Dock and Target interaction in SimLab Composer and Studio:

Manage (Dis)Assembly Process
If the assembly/disassembly process is linear, meaning it follows a defined order, this function can be used. It simplifies the process and makes it follow a very specific scenario. Highlights guide the user throughout the assembly/disassembly process with ease and without confusion:
- A green highlight on an object in an assembly marks the currently available part of the system that can be removed.
- A yellow highlight indicates that the object can be grabbed.
- A blue highlight in the assembly indicates that the object can be placed there.

Press the right mouse button to toggle the hand. The hand has a fixed position in the view, teleports to a grabbable object when you click on it, and keeps grabbing as long as you hold the button.

The assembly management interactions and settings:
- Plus button: adds the selected node (from the 3D area) to the disassembly diagram. If a disassembly graph node is selected, it becomes the parent of the newly added node.
- Ladder button: changes the parent of selected disassembly graph nodes. A graph node must be selected before clicking this button, and the graph node selected afterward becomes the new parent; if a blank space is selected, the graph node becomes independent (with no parent).
- Red Minus button: removes a node from the graph.
- Create new: creates a new disassembly diagram.
- Rename: changes the name of the disassembly diagram.
- Delete: removes the disassembly.

Note: the Training Builder can be used for creating a more open assembly/disassembly process.
It can be used to define the logic for the processes, giving the user more control without forcing her/him to adjust the model structure in the Scene Browser or go through a single path of execution.

VR Assembly System
The Assembly/Disassembly System in SimLab Composer and Studio (Ultimate) lets you seamlessly assemble and disassemble components with precision and flexibility.

Key features of the new system:
- Logical Order Enforcement: the system ensures a logical process, such as requiring screws to be removed before detaching the parts they secure. However, the sequence of removing the screws is flexible; take them out in any order you prefer!
- Interchangeable Parts Recognition: recognizes parts like washers and screws as interchangeable, allowing substitutions as needed.
- Tool Integration: supports the use of tools to remove or place objects, making the process more realistic and efficient.
- Optional Ghost Guides: enable ghosted visuals to easily identify the correct placement for objects, simplifying the assembly process.

This system is robust enough to handle even large and complex assemblies, as demonstrated in the video below.

- Support for Subassemblies: users can disassemble and reassemble subassemblies as needed. A subassembly must be fully reassembled before it can be placed back in the main assembly.
- Custom Control Over Assembly States: for advanced scenarios, users can use new response options to control the state of individual parts during the assembly process.

The tutorial below demonstrates how to effectively use these features.

Select dependency mode
In a mechanical assembly, the creator can select the dependency mode: All (the default in version 14) or Any. For example, with screws, the user might need to remove all screws before taking the part, in which case All is used. In other situations, the user might be able to disassemble a part from multiple sides.
In that case, if the part is free on the left or the right it can be taken apart; that is where Any is used. This is shown in the following tutorial.

Industries Solutions
SimLab Composer continuously adds new features to make the experience of its users more joyful and easier.

Door/Cabinet Animation
Automatically creates an animation for an object depending on its type. This can be useful for scenes with multiple objects of the same type. There are two main methods for defining which objects to animate:
- Object Selection: the user manually selects the components in the scene to create an animation for. This method is suitable for scenes with a small number of objects. Check the following tutorial on automatic animation for doors/cabinets using the Object Selection method.
- Object Name: the user types in the names of the components, and the software automatically finds them in the scene and lists them. This method is more suitable for scenes with a large number of objects to animate. Check this tutorial on automatic animation for doors/cabinets using the Object Name method.

Doors and Cabinets settings:
- Send animation to timeline: sends the created animation as keyframes to the Animation Timeline. This is helpful when you intend to combine multiple animations or modify the animation further.
- Create animation sequences: creates separate animation sequences for each object; these sequences can then be used in VR.
- Animation sequence naming convention: the user can select the naming method for the sequences.
- Attach sequences: makes the scene ready to be used in a VR Experience. Sequences can be applied as Actions (executed when the user clicks on a door or panel) or as Grabbable Sequences (the user can open or close interactively in the VR Viewer).
- Create reversed versions: a reversed version of an animation allows the object to go back to its original state.
For example, if you have created an animation for a door opening, a reversed version of the door closing will be created automatically.

Visualize Scene Options
Visualize scene options (scene states and animation sequences) by displaying them as entities in a 3D world for easier and more practical VR Experiences. Check this tutorial on how to visualize scene options in VR. Press "3" on your keyboard in the VR Viewer to show the available Pins, and use the left mouse button (LBUTTON) to select the Scene States/Animation Sequences that appear as entities on the Pins.

1 Select a Scene Node from the Objects Tree or from the 3D area, then click this option to add a Pin to the selected node, showing its scene states when the user interacts with it in SimLab VR Viewer.
2 Remove a Trigger and eliminate any associated Pins and scene options.
3 The list of Triggers.
4 Activates the Scene States library, to drag and drop Scene States from it into Pin nodes.
5 Activates the Animation Sequences library, to drag and drop Animation Sequences from it into Pin nodes.
6 Add a new Pin to the selected node to hold other scene options chosen by the user. Once added, Scene States or Animation Sequences can be dragged and dropped into it from their corresponding libraries.
7 Remove the selected Pin node from its diagram.
8 Scene States/Animation Sequences can be dragged and dropped here from their corresponding libraries.
9 Added Scene States/Animation Sequences.

Visualize Scene Options - step by step
1- Create multiple Scene States/Animation Sequences (at least two) for a 3D model.
2- From the Interactions menu, click Visualize Scene Options.
3- Select the object for which to make a list, e.g. Bedside Table: select the table model, then add a Trigger by clicking. The object name (Bedside Table) should appear on the Pin and Scene Options side, as shown in the image.
4- Drag and drop the Scene States/Animation Sequences created in step one from their libraries into VR List1, one by one. The Scene States/Animation Sequences will appear under VR List1.
5- Select List 1 to display its properties, and change its name and properties.

Snap and Socket
The VR Snapping feature makes building scenes in VR easier and more intuitive. Whether you're designing a kitchen or creating an educational lab, VR Snapping allows you to effortlessly add items and align them perfectly without any extra effort. Setting up VR Snapping in Composer/Studio is simple and gives you full control over how it behaves in the VR environment. Watch the video below to learn how to use VR Snapping.

XR Anchors
When viewing scenes in XR mode on your Quest headset, you can attach 3D models to specific physical locations in the real world. This feature enables you to:
- Place training models in actual examination rooms.
- Add visual highlights to real-world objects.
- Visualize furniture placement in your space.
Watch the tutorial below to see XR Anchors in action.

User Gadgets Group

Add User Gadgets
This function adds gadgets to the Viewer start model, as shown in the dialog below:
- Adjacent Objects: objects that move with the Viewer start model in the VR environment. They can be things like tools (a screwdriver or a hammer). Check this part of the tutorial for more about Adjacent Objects.
- Controllers: allows the user to replace the VR controllers with selected 3D models of hands or gloves. Controller appearance is only available in VR mode. The following video shows the effect of using Controllers in the VR Experience, and the following tutorial shows how to use the Controllers gadget in SimLab Composer/Studio.
- Hud: a plane shown in front of the user, making it easy to show instructions or current status. The following video shows the Hud appearance in a VR Experience.
The following tutorial describes how to add and use the Hud and Adjacent Objects. The Change Equipment State response in the Training Builder can be used to control the equipment state (Equipped or Unequipped) for all gadgets (Adjacent Objects, Controllers, and Hud).

Avatar Generator
In this dialog users can build their own avatars from a collection of faces, uniforms, headwear, and glasses. Avatars are used in VR Collaboration in SimLab VR Viewer. For more about avatar types and creation, check this tutorial.

Voice Commands
To improve VR-based training that involves interacting with humans in the real world, voice command support has been added. Voice commands are new events that are fired when the corresponding voice command is spoken in the experience. SimLab Composer will support voice commands in 20 languages. The following example shows how voice commands can be used in the VR Experience, and the following tutorial shows how to add voice commands in SimLab Composer/Studio:

AI Agents
The user can make AI agents work with 3D avatars by describing the agent's role, then talk to the agent and get answers in different languages. The following image shows how to set your OpenAI API key in the viewer. To learn more about this feature, check the following tutorial.

Find & Control Searchable Components
With a complex CAD model that has a good tree structure, this feature can be very useful. It allows users to search for objects, then show/hide, isolate, highlight, and grab them while running the experience in the Viewer.

Training Builder Menu
The Training Builder is a visual tool that allows VR Experience designers to control the logic in the VR Experience without writing any code. It lets the user link an Event with Response(s). An Event is fired when something happens in the experience, for example when the user clicks on a specific object, or when an object collides with another object. A Response is a reaction the VR Experience should perform when an Event takes place.
For example, when the user clicks on the Globe Object (Earth_geo) Scene Node in the Node Triggered Event, it starts the rotation sequence (Play Sequence) Response, as shown in the following image.

There is no limit to the total number of elements in the Training Builder; the user can add as many elements as needed across diagrams. To keep things organized, each diagram can hold up to 300 elements. For small projects one diagram should be enough; for larger projects the user should organize the work across multiple diagrams, keeping up to 300 elements in each.

Creating your first VR Experience can serve as a good first interaction with the Training Builder, to see it in action. The following video provides more details about using the Training Builder. SimLab Academy (in the Library panel), under VR Creation, includes a section dedicated to Training Builder tutorials.

Variables and Expressions
The Training Builder enables users to add variables and use them in creating VR Experiences. Check this tutorial about using variables in the Training Builder. Supported variable types are:
- String Variable: stores object names, message data, etc.
- Number Variable: supports both integer and float numbers.
It can be used for calculating and storing values.
- Time Variable: can save the time at any stage of the VR Experience.

String Variables
The initial value can be set to any string in the variable editor in the Training Builder. When used in a Change Variable response, the string should be surrounded by single quotation marks (' '). Expression-supported operations include adding strings and taking substrings.

Number Variables
The initial value can be set to any float or integer number. The following operations are supported for number variables:
- Increment: adds one to the current value.
- Decrement: subtracts one from the current value.
- Time Difference: calculates the difference in seconds (up to millisecond precision) between two time variables.
- Expression: a large number of expressions are supported; the list of supported expressions can be found in the following link. The following tutorial shows how expressions can be used in the Training Builder.

Time Variables
The initial value of every time variable is set to the start time of the VR Experience. At any point in the VR Experience, the user can capture the current time and store it in a time variable.

Variable Writer
This tool enables the user to track the value of a variable in the VR Experience. The value of the variable is updated dynamically: whenever the variable value changes, the variable writer updates to show the new value. The Variable Writer is available under the VR Effects menu. The user can control its size and placement, so it can be aligned on a wall in the scene or on a monitor. The user selects which variable to view, the color of the text, and the prefix text.

Create/Edit Variables
Variables can be created and edited by clicking the Edit Variables button at the top right of the Training Builder. It allows the user to view, delete, or change the initial values of existing variables, or create new ones. Clicking '+' at the top right corner of the Edit Variables and Initial Values dialog adds a new variable.
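The number- and time-variable operations described above can be summarized in a short sketch; the names here are illustrative, not SimLab identifiers:

```python
class NumberVariable:
    """Supports integers and floats, with the operations listed above."""
    def __init__(self, value=0):
        self.value = value
    def increment(self):        # adds one to the current value
        self.value += 1
    def decrement(self):        # subtracts one from the current value
        self.value -= 1

def time_difference(t_start, t_end):
    """Difference in seconds between two time variables,
    kept to millisecond precision."""
    return round(t_end - t_start, 3)

n = NumberVariable(10)
n.increment(); n.increment(); n.decrement()
print(n.value)                        # 11
print(time_difference(1.0, 3.5))      # 2.5 seconds
```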
The user can click to change the name of the variable, select its type from the combo box, and set its initial value.

Inputs
Inputs are entities used as triggers for actions in the Training Builder diagram. For example, if multiple nodes in the diagram use a Scene Node, it can be used as an input and connected to multiple blocks. Updating the input once is then reflected on all blocks using it.

The available inputs (each with its own icon) are: Sequence, Scene State, Scene Node (object), String Variable, Action, Boolean, and Hand.

Events
Events are triggered when something happens in the VR Experience. The following image shows how to use the Scene Start Event, which takes place as soon as the VR Experience starts, to play a Sound Action Response.

The available events (each with its own icon) are: Scene Start, Scene State Applied, Sequence Ended, Node Entered Node, Node Exited Node, Hand Entered Node, Hand Exited Node, Voice Command Recognizer, Node Triggered, Node Grab Started, Node Grab Ended, Delay, Grip Pressed, Hand Entered Hand, Hand Exited Hand, and Voice Command Test.

To learn more about the Grip Pressed event, check this tutorial.

Responses
Responses are defined reactions to events. To link a Response to an Event, the user simply connects the Execute channel of the Event to the Execute channel of the Response. The user can connect the output Execute channel of one Response to the input Execute channel of another Response to guarantee the order of execution and to link multiple responses to an Event.
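Chaining Execute channels is essentially running responses one after another. A rough Python analogy (the Training Builder itself needs no code; this is only to picture the flow):

```python
def fire_event(responses):
    """Run the responses connected to an event in Execute-channel order."""
    for respond in responses:
        respond()

log = []
fire_event([
    lambda: log.append("first response"),
    lambda: log.append("second response"),
    lambda: log.append("third response"),
])
print(log)   # responses ran in the connected order
```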
The diagram below shows the responses for clicking on an object (Node Triggered):
1- Play a Sequence
2- Wait for 2 seconds
3- Hide an object from the scene

The available responses (each with its own icon) are: Apply Scene State, Change Node Grabbable State, Play Sequence, Stop Sequence, Show/Hide, Show Message Box, Teleport to Camera, Change Node Action, Sound Action, Video Action, Report user-defined measurement, Active Quiz/Survey, Change Equipment State, Change Grabbable Sequence, Enable/Disable Physics, and Vibrate Controller.

Advanced responses: Change Variable Value, Branch (checks a value; if true it follows one path, if false the other), Get Attribute Value, Set Attribute Value, Glow Object, Un-glow Object, Point To Object, Remove Point To Object, Get Position, Set Position, Fall to Surface, Reset Rotation, Set Parent, and Reset Parent.

To learn more about the Vibrate Controller response, check this tutorial.

Booleans
Booleans include two groups: States and Operations. States are similar to events, and the current state can always be read from them. For example, in the following image Objects Overlap is shown: if the two objects overlap you get an OnTrue execution, and as soon as they do not, you get an OnFalse execution. The output Boolean value can be checked at any point to see whether they are overlapping or not. OnTrue and OnFalse are only fired when the state changes. Operations allow running Boolean operations on Boolean variables; supported operations include And, Or, and Not.

The available Booleans (each with its own icon) are: Compare Variable Value, Objects Overlap, Object is Grabbed, And Operation, Or Operation, and Not Operation.

Templates
The Training Builder works great for describing multiple-step training scenarios. But what if the same behavior is repeated for a class of objects? For example, if the user grabs any of the tools in the training, the response should be for it to fall to the ground. This is when templates are used. To create a Template diagram in the Training Builder, click the Create Template button at the top right side of the Training Builder.
After clicking Create Template, a new Template Diagram is added. A template diagram differs from a regular Training Builder diagram in its orange background marks, and in the Template Scene Node block added to it. This block cannot be deleted. The Template Scene Node block selects a group of Nodes based on a query that can include one or more rows connected with And or Or, as shown in the following image. Logic connected to the Template Scene Node block is applied to each Node that satisfies the selection query; for example, in the previous image, each object having "box" in its name and "wood" in the value of its material attribute will be selected. The following tutorial shows the power of Template Diagrams in the Training Builder.

Advanced Features

Copy Part of a Diagram

To repeat the same logic for more than one object (in case templates did not do the job), part of the diagram can be copied. Use the Left Mouse button to highlight the part of the diagram to copy, press CTRL + C while the section is highlighted to copy it, then CTRL + V to paste. After that, the Scene Node object needs to be changed, along with any other block that needs updating.

Drag/Drop of the Scene

To add inputs like a Scene State or a Sequence, the following process is used:
1- Click on Sequence from the toolbar
2- From the Sequence Library select the desired Sequence
3- Click Attach Sequence
As shown in the following image. If you already know the Sequence or the Scene State you want, you can directly drag it from the Library and drop it on the Training Builder.

Right Mouse Clicking

Right-clicking in the Training Builder shows all toolbar elements, organized, so you can add any block without moving the mouse to the toolbar.
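The Template Scene Node selection query described earlier (one or more rows combined with And/Or) can be sketched as a filter over scene nodes. This is a hypothetical data model for illustration, not SimLab's implementation:

```python
# Illustrative sketch of a Template Scene Node selection query
# (hypothetical data model, not SimLab's API): select every node whose
# name contains "box" AND whose "material" attribute contains "wood".

def matches(node, rows, combine="And"):
    """Each row is (field, substring); rows are combined with And/Or."""
    results = []
    for field, substring in rows:
        value = node["name"] if field == "name" else node["attributes"].get(field, "")
        results.append(substring in value)
    return all(results) if combine == "And" else any(results)

scene = [
    {"name": "box_01",   "attributes": {"material": "wood_oak"}},
    {"name": "box_02",   "attributes": {"material": "steel"}},
    {"name": "chair_01", "attributes": {"material": "wood_pine"}},
]

query = [("name", "box"), ("material", "wood")]
selected = [n["name"] for n in scene if matches(n, query, "And")]
# selected == ["box_01"]: only box_01 satisfies both rows
```

Switching the combine mode to "Or" would instead select any node matching either row, which is the behavior of connecting the query rows with Or in the Template Scene Node block.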
You can also type a few characters in the filter to find an element quickly, as shown in the following image:

Attributes

Attributes can be used with the Training Builder in many ways: they can be used for creating Template diagrams; they can be checked to determine behavior; and they can be used as local variables saved on each object. It is a good idea to be familiar with attributes when creating advanced VR Experiences. To learn more about attributes, check the following tutorial:

Enhancing Medical Training with VR Palpation Simulation

The Training Builder Hand Source and Events for VR medical simulations empower VR Experience designers with unprecedented control over advanced hand-skills training, particularly in processes like patient palpation. Watch the demonstration in the video below to witness the immersive and customizable experience:

Tools to Organize the Training Builder Experience

Some users are creating large and complex training and virtual reality (VR) Experiences using the Training Builder. To help manage these complex diagrams, the following tools are provided:
Notes: Use notes to describe the logic, making it simpler for both you and others to understand and update Training Builder diagrams in the future.
Groups: You can organize nodes into groups and give them clear names. This makes it easier to find and update the logic of your Training Builder.
Search: Suppose you've improved an animation sequence and want to replace the old one in the Training Builder. Use the search tool to find all instances of the old sequence and replace them with the new one.

Dynamic Builder

In Template diagrams, a user can define logic for a group of objects based on name or attribute criteria. Dynamic diagrams take this to the next level: they allow defining interactions between multiple templates, in a dynamic way.
What this means is that a user can set dynamic criteria for triggering actions and responses, mainly by defining multiple queries using multiple scene nodes. When you have access to the Dynamic Builder, the default diagram will be a Dynamic diagram; to add a new one, click the highlighted Add a new dynamic diagram button at the top right side of the Training Builder diagram. A new Dynamic Diagram will open, and the Dynamic Diagram menu will appear. The Dynamic Training Builder menu is divided into four groups:

Sources

A source is an expression that generates a value that can be used to trigger an event in the Dynamic Builder. Sources are divided into two tabs, Main Sources and Template Sources.
Main Sources: Scene Node, Scene State, Expression, Number, Boolean, Hand, Sequence, Variable, String
Template Sources: Scene Node Query, User Query

Events

Events are triggered when something happens in the VR Experience. They are arranged into two tabs, Main Events and Object Interactions.
Main Events: Scene Start, Node Triggered, Scene State Applied, Sequence Ended, Node Grab Started, Node Grab Ended, Grip Pressed, Grip Released, Voice Command Recognized, Ticker, Variable Changed
Object Interactions: Node Enter Node, Node Exited Node, User Enter Node, User Exited Node, Node Hover Started, Node Hover Ended, Hand Enter Node, Hand Exited Node, Hand Entered Hand, Hand Exited Hand, Node Assembled, Node Disassembled

Right Mouse Events

Not all events are shown in the diagram; more are available under the mouse right-click menu, as shown below: Key Pressed/Key Released. To learn more about this event, check this tutorial.

Responses

Responses in the SimLab Dynamic Builder are arranged in a number of tabs to make them easier to use.
Main Responses: Play Animation Sequence, Apply Scene State, Set Parent Node, Reset Parent Node, Set Node Grabbable State, Delay, Branch on Expression, Loop, Show/Hide, Set Node Glow State, Enable Controller Vibration, Disable Controller Vibration, Get Cloud Attribute, Set Cloud Attribute, Open External Connection, Send Message
Other response tabs: Objects Behavior, Animation Sequences, Numbers, String, Time, Variable, Media, Attributes, User, AI Agents, Scene Management, States

The following video shows the behavior in action. If you do not get why this is useful, do not worry about it for now; in the future, when you create more dynamic experiences, you will be glad to have this at your disposal. The following diagram shows how Loop End is used. It is combined with the fact that in the Dynamic Builder the Show Message strings can also be dynamic, so we can show a message depending on the execution of the experience.

Markups

VR experiences are valuable assets that include custom logic created by users. As more users build advanced experiences, and as these experiences are shared and used over time by multiple creators, we've added colored groups and notes to help manage them better. These features make it easier to organize related logic into groups and add clear descriptions for each section in the Training Builder. Check this tutorial.

VR Catalog Menu

Generate Catalog

With this feature, users can insert ready-made models into the scene, such as tables, chairs, doors, trees, and much more, directly in the VR Viewer without the need to go back to SimLab Composer and run the Viewer again. New models can also be added to the Catalog and shared with others online. To create a VR Catalog in SimLab Composer/Studio for the VR Viewer, follow the steps below:
Export your scenes/models as .vrpackage files into the same folder on your device, as shown in the screenshot below.
Create a preview image (PNG) for the catalog to display in the catalog menu in the VR Viewer; a square image with a size of 256x256 pixels is recommended.
Select Generate Catalog, select the folder containing the VR Packages, and select the preview image. Give a name to the catalog, and click Generate. The new *.vrcatalog file will be created in the same folder where the *.vrpackage models are saved. From the object tree select Environment, then in the Properties dialog/VR tab, select Add from file (the folder icon) and select the *.vrcatalog file created. The *.vrcatalog file can also be uploaded online to a website, and its URL can be added by pressing the plus icon. Now all models in the Industrial catalog we created will appear in the Catalog menu in the VR Viewer; just one click on a model inserts it into the scene.

Generate Catalog List

In this dialog the user can add more than one *.vrcatalog file, from a local computer or from a website as a URL link, then save it as a .vrcataloglist. This catalog list can be loaded into other scenes to be used in the VR Viewer. The tutorial below shows how to use models from the VR Catalog in SimLab VR Viewer. The tutorial below shows how to load and use more than one VR catalog in SimLab VR Viewer. You can visit the VR Catalog web page on our website via this link.

VR Viewer Menu

The VR Viewer menu is where the user can export 3D scenes into the *.vrpackage file format and view them in either Desktop or VR mode.

Show in Viewer

SimLab VR Viewer enables users to experience rich and interactive environments in three navigation modes:
Desktop: Game-like controls using keyboard and mouse, with the viewer running on Windows or Mac.
PC VR: For VR headsets connected to a PC, like Rift S, VIVE, VIVE Pro, or Quest connected with a Link cable.
Standalone VR: For standalone headsets like Pico, Quest (not connected to a PC), Android, and iOS.

Texture Optimization

More information about Texture Optimization can be found here.

Settings

Optimize Package: Merges all objects sharing the same material into one object, which in turn speeds up and smooths the experience significantly, especially when the model has a large number of objects. Optimize Package affects static objects only; objects that have animation, scene states, or are part of any interaction are excluded from the process, so any interactions in VR will work without problems.
VR Viewer Only: When checked, the exported VR Package cannot be imported back into SimLab Composer.
Enable VR Menu: The experience designer can control what the user running the VR Experience will be able to view, and how the user can interact with the scene in the VR Viewer.

Open Viewer

Opens the stand-alone VR Viewer, which can then be used to open a VR Package or to share a VR Package with others.

VR Environment

Displays the Properties dialog at the right side of the application window. In this dialog, with its different tabs, the user can change settings for the application/VR environment. We will explain some settings whose meaning may not be clear: Under the HDR tab, "Auto Adjust HDR Brightness (VR)" puts a limit on the brightness of the HDR when running the VR Experience in the Viewer; if you want high brightness in the Viewer, you should disable this setting. The Ground tab affects only Rendering and thus is only found in Composer.
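The merge step that Optimize Package performs, as described under Settings above (combine static objects that share a material, leave animated or interactive objects untouched), can be sketched roughly as follows. The data model here is hypothetical, not SimLab's implementation:

```python
# Rough sketch of merge-by-material optimization (hypothetical data
# model, not SimLab's implementation): static objects sharing a material
# are merged into one object; animated/interactive objects are excluded.

from collections import defaultdict

def optimize_package(objects):
    merged = defaultdict(list)   # material -> vertices of static objects
    kept = []                    # dynamic objects pass through unchanged
    for obj in objects:
        if obj.get("animated") or obj.get("interactive"):
            kept.append(obj)     # excluded: interactions must keep working
        else:
            merged[obj["material"]].extend(obj["vertices"])
    for material, vertices in merged.items():
        kept.append({"material": material, "vertices": vertices, "merged": True})
    return kept

scene = [
    {"material": "brick", "vertices": [1, 2]},
    {"material": "brick", "vertices": [3, 4]},
    {"material": "glass", "vertices": [5], "interactive": True},
]
result = optimize_package(scene)
# The two static brick objects merge into one draw unit;
# the interactive glass object is kept as-is.
```

Fewer objects means fewer draw calls at runtime, which is why this optimization matters most for scenes with very many small static objects.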
Under the VR tab, let's clarify the following settings:
"Enable VR flying": Since users can't switch between navigation modes while running experiences in VR mode, this setting makes the VR Experience run with the flying navigation mode on.
"Use Light Maps": Allows you to use baked lighting, either created in SimLab Composer or imported with the model. Light maps simulate lighting effects without requiring real-time calculations, reducing performance demands and ensuring a smoother VR Experience, especially on lower-end devices.
"Multiple lights on Standalone Devices": This affects artificial lights (lights that you add from the "Create" menu), enabling or disabling them specifically when running on standalone devices, since these lights can heavily affect performance there.
"Enable Highlights": Enables or disables the highlights you see on objects that you can interact with or grab.
"Accurate Physics": If you have physics in your experience, this increases the accuracy by making more calculations for the physics simulation, at a cost in performance.
"Adding Catalogs": Catalogs are specialized asset collections that can be created in SimLab Composer and uploaded online for easy access. This feature allows you to link external assets to a scene, enabling users to import them dynamically at VR Experience runtime. Unlike standard assets, catalogs can be stored in and retrieved from an online repository, making them accessible across multiple projects and devices.

VR Display Settings

The user can adjust the display settings in the VR Viewer using the dialog below. Check this tutorial for more information.

Set Start Position

Loads the Viewer Start model; clicking on any place in the scene after clicking Set Start Position will snap the Viewer Start model to the picked location. The Viewer Start model defines the place and orientation at which the VR Experience will start.
If the Viewer Start model was already added to the scene, no new model is added, and the user can directly click to snap the Viewer Start model. Users starting the VR Experience without calling Set Start Position will automatically have the Viewer Start model added to the scene and will be able to set its location. The Viewer Start model can also be used in the Training Builder; for example, it is used in the Node Entered Node Event, in a Training Builder diagram, to turn lights on when a VR Experience user enters a room.

Create VR Camera

Create VR Camera adds a new VR Camera to the scene. The user should place the camera and set its orientation; the currently selected VR Camera appears in wireframe mode. The user can place multiple VR Cameras in the scene. While running the VR Experience, the user can activate Camera Teleport mode by clicking on it, or by using the shortcut (4), as shown in the following image. When starting Camera Teleport mode, the scene turns into a ghost view and VR Cameras are shown as white spheres; hovering over one of the spheres shows the view from that camera's perspective, and clicking on the white sphere moves the user to the VR Camera's location. A VR Camera can also be used in the Training Builder, in the Teleport to Camera response block. When this response is called, the viewer is moved from the current location to the location of the VR Camera connected to the response block. When moving the viewer to a VR Camera, we can choose to align the viewer orientation with the orientation of the VR Camera by checking the Apply Orientation option in the VR Camera Properties panel, as shown in the following image. A VR Camera can also be linked to the Surveillance Plane under the VR Effects menu; to learn more about using a VR Camera with a VR Surveillance camera, use the following link.

Export VR Package

Exports the current scene as a VR Package (*.vrpackage), which can be opened using the free SimLab VR Viewer.
When exporting the VR Package the user can set: Title, Publisher, Description, and Preview image. This information will be visible in the VR Viewer to help users find the correct VR Experience.

Export VR Catalog Item

VR Catalog Items are very similar to VR Packages, and they share the same extension (*.vrpackage). The only difference is that they do not require adding a Viewer Start model, as they are not intended to be used as stand-alone VR Experiences; they are created to be used in VR Catalogs, visible while running the VR Experience, like the one shown in the image below. Creation of new VR Catalogs requires the Ultimate edition of SimLab Composer.

Export EXE File

EXE files can be exported from the Windows version of SimLab Composer. They are similar to VR Packages, as they include the current scene, but EXE files can be shared with users who do not have VR Viewer installed. Upon clicking the EXE, it will try to find VR Viewer; if found, it will use it to open the scene. If VR Viewer is not found, it will automatically install the latest version of VR Viewer and then open the scene, which makes it easier to share VR Experiences with new users.

VR Store

Designed to simplify the process for VR Experience designers to market and sell their creations. In earlier versions, we tested a VR warehouse model, but based on feedback from designers, it was clear that they preferred licensing their VR Experiences instead of selling zim files. This allowed them to retain copyrights for the zim files, enabling them to build new experiences on top of them or customize them for specific users. SimLab VR Store addresses these concerns by offering an alternative avenue for marketing and selling VR Experiences, while still maintaining the availability of traditional methods. This new feature is particularly advantageous for larger organizations, providing them with an easier way to sell their VR content and gain more visibility.
Any experience created using the free SimLab Composer or VR Studio, after undergoing the review process, will be showcased in the VR Store. These experiences can be either paid or free and will appear without watermarks. This flexibility is beneficial for new users who want to start showcasing their work without any initial financial investment, and it also caters to the needs of educational institutes. The VR Store is intended for creators, and we welcome any notes, requests, or feedback from users. Your input is valuable in helping us refine and improve this platform to better serve the VR community. Please feel free to share your thoughts with us. Check this article to learn how to protect your VR product and prevent copyright infringement.

Viewer Settings

This option gives users quick access to viewer settings, to check and update those settings before exporting the VR Package.

Languages

This option allows the user to make the VR Experience support multiple languages. The user defines special objects for each language; those objects can be Objects, Videos, Sounds, Presentations, or Text. When starting the experience, the user selects the language he/she prefers to run the experience in; all objects in other languages are automatically deleted, so the VR Experience runs in the selected language. The following video shows how to use multiple language support.

Mixed Reality Elements

The Quest 2 and Quest 3 now offer mixed reality support, enabling users to engage in interactive experiences while maintaining visibility of their actual surroundings. With the Mixed Reality option, designers can specify the environmental elements within the experience. When users run the experience in Mixed Reality (MR) or Augmented Reality (AR) mode, these elements will be concealed, whereas they will be visible in other modes.
This flexibility empowers designers to craft a unified experience that seamlessly transitions across Virtual Reality (VR), Mixed Reality (MR), and Augmented Reality (AR) modes. The following tutorial shows how to use the MR elements: In addition to single-user MR mode, MR mode can be used in collaboration; the following tutorial shows how users located in the same room perform synchronization to work effectively in MR mode.

VR Evaluation

The VR Evaluation menu gives the user access to the full VR Evaluation process, which includes:
Creating Quizzes/Surveys
Distributing VR experiences to students
Viewing Results
The VR Evaluation chapter goes through the process of creating and distributing a successful VR Evaluation.