FAQ
What is the MetaVRse Engine?

The MetaVRse Engine is a web-based tool designed to make XR and 3D content creation powerful, universal, and easy to use. It features a no-code/low-code template system, automatic embedding into any website or iOS/Android app, and a range of tools for building lightweight, universal demos for business and education.

What platforms do you support?

The MetaVRse Engine automatically generates code for embedding 3D/XR experiences into existing websites as well as iOS and Android apps. We will offer VR and MR hardware integrations in the near future.
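
As a rough illustration, embedding generally amounts to placing the published experience in your page. The snippet below is a minimal sketch only, assuming an iframe-based embed with a placeholder URL and element id; the engine generates the real embed code for you.

    // Minimal illustrative embed (placeholder URL and element id).
    const frame = document.createElement('iframe');
    frame.src = 'https://example.com/your-experience-id'; // placeholder
    frame.width = '100%';
    frame.height = '600';
    frame.allow = 'xr-spatial-tracking; fullscreen'; // permissions a 3D/XR embed typically needs
    document.getElementById('viewer').appendChild(frame);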

How robust is the engine?

The MetaVRse Engine is powered by an ultra-optimized proprietary rendering core that is capable of displaying over 1 million polygons with 16,000 textures on a mobile device.

Do you support mobile devices?

Demos created with the MetaVRse Engine can be accessed on any modern mobile device, such as an iPhone, iPad, or Android phone or tablet. However, you will need a Mac, PC, or Linux computer with a modern browser to build with the engine.

What file formats do you support?

We support industry-standard 3D file and image formats. The current recommended formats include:

  • PNG
  • JPEG
  • HDR (for skyboxes)
  • FBX
  • OBJ

We are currently working on support for additional formats, including glTF and USDZ, as well as audio and video.

How does MetaVRse augmented reality work?

The MetaVRse Engine powers augmented reality experiences on Android and iOS using the MetaVRse One App, a powerful yet lightweight utility application (16 MB). The app is universal and opens AR experiences directly from the web. With the One App, creators have access to the full power of ARCore (Google) and ARKit (Apple) for the best possible AR experiences.

When you launch an augmented reality experience for the first time, you will be redirected to Google Play or the App Store. Once you have downloaded the app, open your experience again on the web and tap the AR button.
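
The first-launch redirect described above is commonly implemented on the web with a timed store fallback. The sketch below shows that general pattern with placeholder arguments; it illustrates the technique, not MetaVRse's actual implementation.

    // Common store-fallback pattern (illustration only).
    function launchAR(deepLink, storeUrl) {
      window.location.href = deepLink; // e.g. a custom URL scheme the app registers
      setTimeout(() => {
        // If the page is still visible, the app likely is not installed.
        if (!document.hidden) {
          window.location.href = storeUrl; // Google Play or the App Store
        }
      }, 2000);
    }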

The One App is designed to be universal, interoperable, and almost invisible, allowing for a feature-rich experience. It can also be white-labelled for custom deployments.

How can I use textures baked with lighting, shadows, etc.?

The MetaVRse Engine supports textures with baked-in features (such as lights and shadows) and does not treat them differently from any other texture. Simply drag and drop your texture onto the corresponding mesh and it will be applied automatically. Note that most texture maps read from the R channel by default unless specified otherwise during the creation process; the only exception is Opacity, where the A (alpha) channel is used by default to represent transparency.
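
To make the channel convention concrete, here is a small browser-side sketch (illustrative only, independent of the engine) that reads the channels of a single texel: the R value is what most single-value maps consume, while A carries opacity.

    // Inspect one texel of a loaded image to see its channel values.
    function sampleChannels(image, x, y) {
      const canvas = document.createElement('canvas');
      canvas.width = image.width;
      canvas.height = image.height;
      const ctx = canvas.getContext('2d');
      ctx.drawImage(image, 0, 0);
      const [r, g, b, a] = ctx.getImageData(x, y, 1, 1).data;
      return { value: r, opacity: a }; // R drives most maps, A drives opacity
    }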

If my lighting is baked into my textures, do I still need any lights in the scene?

Yes, textures with baked-in features still require scene lights and/or a skybox environment to be visible. Baked features do not actually light up textures, objects, or scenes; they are illusions that represent a particular lighting effect.

Do you provide integration with face and body tracking?

We currently do not have the ability to perform body and facial tracking. That being said, our system has the capabilities required to perform both tasks, and an out-of-the-box integration is part of our mid-term roadmap. Our development priorities are responsive to customer needs.

Why do you use device-based rendering rather than cloud-based rendering?

MetaVRse Engine rendering takes place on the local host device (phone, laptop, tablet, etc.). Our cloud services are modeled around the seamless hosting and serving of 3D experiences to all devices. We have discussed hybrid rendering to offload some of the heaviest tasks to the edge; however, at this time it is not our primary focus, as the 5G infrastructure it requires is still in its infancy.

On a broader level, there are advantages and disadvantages to both approaches. Rendering on local devices is a more reliable and stable solution that does not depend on a steady, super-fast connection, and as personal devices get more powerful every year, their capacity to render complex 3D experiences keeps growing. Cloud rendering, on the other hand, has virtually limitless computing potential and, with a stable and fast 5G connection, can render lifelike experiences. The downside is that it is still prohibitively expensive for the average user to pay for continuous access to cloud-hosted high-power GPUs, and stable 5G-connected devices and infrastructure are still scarce.

Am I able to create a frontend editor on top of your solution?

Absolutely. Our platform is fully modular and flexible, ready to adopt custom HUDs and frontend solutions. Everything on the frontend is fully customizable and client-oriented. If you require a custom solution, please contact us.

What should I do if we need capabilities that the MetaVRse platform doesn’t provide?

Our platform is modular and fully customizable to the needs of our clients. If there are missing features or capabilities, we would be happy to do a custom integration. Please let us know your requirements and challenges by contacting us.

Can animated files be imported and work within the MetaVRse Engine?

Our system supports .FBX files with full animations! When you import a file into the Editor, all the contained animations are accessible through the “Animations” tab. Animations can also be controlled through the JavaScript editor conveniently built into the Editor.
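
As a rough sketch of what script-driven playback could look like, the snippet below uses placeholder names (scene.getObjectByName, playAnimation); these are hypothetical, not the engine's documented API.

    // Hypothetical sketch: both calls below are placeholder names.
    const model = scene.getObjectByName('Robot');  // object from the imported .FBX
    model.playAnimation('Walk', { loop: true });   // a clip listed under the "Animations" tab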

What should I do if the texture settings are different from what I expect?

Our system supports two different rendering modes: Blinn-Phong and physically based rendering (PBR). Diffuse, Reflections, and Specular are all part of the Blinn-Phong shader, which does not support PBR and therefore uses a different naming convention for texture files. The MetaVRse Engine can toggle between these two standards.

Once PBR is turned off, the names of the drag-and-drop windows change to Diffuse, Reflections, and Specular. Our system can also interpret Blinn-Phong textures as PBR, mapped as follows (see the sketch after this list):

  • Diffuse = Albedo
  • Reflections = Roughness
  • Specular = Has no equivalent and is therefore not utilized
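
Expressed as a plain JavaScript lookup table, the same mapping reads:

    // Restates the slot mapping above; nothing engine-specific.
    const blinnPhongToPBR = {
      Diffuse: 'Albedo',
      Reflections: 'Roughness',
      Specular: null, // no PBR equivalent, so it is not used
    };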

How can a user control an avatar in the environment?

Avatars can be controlled with a mouse and keyboard, or through a touch-enabled experience (somewhat like Fortnite). All avatars can be animated and controlled in our web Editor, and we can also build out those feature sets as per client requirements.
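
As a generic, engine-independent illustration of keyboard-driven movement, the sketch below tracks held keys and nudges an assumed avatar object (with x and z fields) a little each frame:

    // Track which keys are held, then apply movement per frame.
    const keys = new Set();
    window.addEventListener('keydown', (e) => keys.add(e.key.toLowerCase()));
    window.addEventListener('keyup', (e) => keys.delete(e.key.toLowerCase()));

    function updateAvatar(avatar, dt) {
      const speed = 2.0; // illustrative units per second
      if (keys.has('w')) avatar.z -= speed * dt;
      if (keys.has('s')) avatar.z += speed * dt;
      if (keys.has('a')) avatar.x -= speed * dt;
      if (keys.has('d')) avatar.x += speed * dt;
    }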

Can video be mapped onto an object?

Yes, all textures and videos can be mapped onto any 3D object. Our solution does not offer UV mapping functionality, as this is available in all 3D creation tools (Maya, 3ds Max, Cinema 4D, Blender, etc.). Once the UVs have been mapped, it’s a simple drag-and-drop scenario in our Web Editor.
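
For context, a video element is a standard texture source on the web. In the sketch below, applyVideoTexture is a hypothetical stand-in for the drag-and-drop step, and the file name is a placeholder:

    // Prepare a video for use as a texture source.
    const video = document.createElement('video');
    video.src = 'product-demo.mp4'; // placeholder file name
    video.loop = true;
    video.muted = true; // browsers generally allow autoplay only when muted
    video.play();
    // engine.applyVideoTexture(mesh, video); // hypothetical engine call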

What’s the best way to create an environment that can be navigated?

All actions can be invoked from within a 3D environment; this includes loading other scenes, adding new 3D objects, and changing lights, skyboxes, colours, textures, etc. Note that exploration and movement through an environment involves one or more 3D objects, rather than a skybox.

Got a question for our team? Tweet us @metavrse or give us a shout!