The MetaVRse Engine is a web-based tool designed to make XR and 3D content creation powerful, universal, and easy to use. It features a no-code/low-code template system, automatic embedding into any website or iOS/Android app, and a range of powerful tools to make lightweight and universal demos for business and education.
The MetaVRse Engine automatically generates code for embedding 3D/XR experiences into existing websites as well as iOS and Android apps. We will offer VR and MR hardware integrations in the near future.
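As a rough sketch of what such an embed might look like, here is a hypothetical helper that builds an iframe snippet. The URL pattern, attributes, and dimensions are illustrative assumptions, not the engine's actual generated markup:

```typescript
// Hypothetical helper that builds an iframe embed snippet for a hosted
// 3D/XR experience. The URL, sizing, and allow-list are illustrative
// assumptions, not the engine's actual generated code.
function buildEmbedSnippet(
  experienceUrl: string,
  width = 800,
  height = 600
): string {
  return [
    `<iframe src="${experienceUrl}"`,
    `        width="${width}" height="${height}"`,
    `        frameborder="0" allow="xr-spatial-tracking; fullscreen">`,
    `</iframe>`,
  ].join("\n");
}

// The returned markup can be pasted into any existing web page.
const snippet = buildEmbedSnippet("https://example.com/my-experience");
```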
The MetaVRse Engine is powered by an ultra-optimized proprietary rendering core that is capable of displaying over 1 million polygons with 16,000 textures on a mobile device.
Demos created with the MetaVRse Engine can be accessed on any modern mobile device, such as an iPhone, iPad, or Android phone or tablet. However, building with the engine requires a Mac, PC, or Linux computer with a modern browser.
We support a full range of industry-standard 3D file formats, as well as a variety of audio, video, and image formats. These include:
The MetaVRse Engine powers augmented reality experiences on Android and iOS through the MetaVRse One App, a powerful yet lightweight viewer. When you launch an augmented reality experience for the first time, you will be redirected to Google Play or the App Store. Once you have downloaded the app, open your experience again on the web and click the AR button.
One App is designed to be interoperable with the augmented reality solutions developed by Apple (ARKit) and Google (ARCore), allowing for a feature-rich, universal experience. It can also be white-labelled for custom deployments.
The MetaVRse Engine supports textures with baked-in features (such as lights and shadows) and treats them like any other texture. Simply drag and drop your texture onto the corresponding mesh and it will be applied automatically. Note that most textures use the R channel by default unless specifically stipulated during the creation process; the only exception is Opacity, where the A (alpha) channel is used by default to represent transparency.
Yes, textures with baked-in features still require scene lights and/or a skybox environment to be visible. Baked features do not actually light up textures, objects, or scenes; they are illusions that represent a particular lighting effect.
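The channel convention above can be sketched as a small lookup. This is a hypothetical helper, and the map-type names are illustrative; the point is simply that Opacity is the alpha-channel exception:

```typescript
// Default texture channel per map type, following the convention above:
// most maps read the R channel unless stipulated otherwise at creation
// time; Opacity reads the A (alpha) channel. Map-type names here are
// illustrative assumptions, not the engine's API.
type Channel = "R" | "G" | "B" | "A";

function defaultChannel(mapType: string): Channel {
  // Opacity is the only map that defaults to the alpha channel.
  return mapType.toLowerCase() === "opacity" ? "A" : "R";
}
```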
We do not currently support body or facial tracking. That said, our system has the capabilities required for both, and an out-of-the-box integration is part of our mid-term roadmap. Our development priorities are responsive to customer needs.
MetaVRse Engine rendering takes place on the local host device (phone, laptop, tablet, etc.). Our cloud services are built around the seamless hosting and serving of 3D experiences to all devices. We have discussed hybrid rendering to offload some of the heaviest tasks to the edge; however, it is not our primary focus at this time, as the 5G infrastructure it requires is still in its infancy.
On a broader level, there are advantages and disadvantages to both approaches. Rendering on local devices is a more reliable and stable solution that does not depend on a steady, super-fast connection, and as personal devices grow more powerful every year, their ability to render complex 3D experiences increases rapidly. Cloud rendering, on the other hand, has virtually limitless computing potential and, with a stable and fast 5G connection, can render lifelike experiences. The downside is that continuous access to cloud-enabled high-power GPUs is still prohibitively expensive for the average user, and stable 5G-connected devices and infrastructure remain scarce.
Absolutely, our platform is fully modular and flexible, ready to adopt custom HUDs and frontend solutions. Everything from the frontend perspective is fully customizable and client-oriented. If you require a custom solution, please contact us.
Our platform is modular and fully customizable to the needs of our clients. If there are missing features or capabilities, we would be happy to do a custom integration. Please let us know your requirements and challenges by contacting us.
Our system supports two different rendering modes: Blinn-Phong and physically based rendering (PBR). Diffuse, Reflections, and Specular are all part of the Blinn-Phong shader, which does not support PBR; as a result, it uses an alternate set of naming conventions for texture files. The MetaVRse Engine can toggle between these two standards.
Once PBR is turned off, the names of the drag-and-drop windows change to Diffuse, Reflections, and Specular. Our system is also capable of interpreting Blinn-Phong textures as PBR; they map as follows:
- Diffuse = Albedo
- Reflections = Roughness
- Specular = Has no PBR equivalent and is therefore not used
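The mapping above can be sketched as a small conversion table. This is an illustrative helper, not the engine's API; the slot names are assumptions, and Specular maps to nothing because it has no PBR equivalent:

```typescript
// Interpreting Blinn-Phong texture slots as PBR slots, per the list above:
// Diffuse -> Albedo, Reflections -> Roughness, Specular -> dropped.
// Slot names are illustrative assumptions, not the engine's API.
const BLINN_PHONG_TO_PBR: Record<string, string | null> = {
  diffuse: "albedo",
  reflections: "roughness",
  specular: null, // no PBR equivalent; not used
};

function toPbrSlot(blinnPhongSlot: string): string | null {
  return BLINN_PHONG_TO_PBR[blinnPhongSlot.toLowerCase()] ?? null;
}
```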