G’day. Here’s what I discovered this week in 3D…
Source:
Photo: https://www.youtube.com/watch?app=desktop&v=80GrFXFOayE
At Siggraph, Nvidia presented extensions to its Omniverse simulation platform, including integrations with the open-source 3D animation tool Blender and with Adobe’s 3D applications.
With its expanded simulation and collaboration platform Omniverse, Nvidia aims to bring millions of new users to the metaverse. Blender, an open-source 3D animation tool, now supports Universal Scene Description (USD), giving 3D artists access to Omniverse production pipelines. Adobe is working with Nvidia on a Substance 3D plugin that adds support for Substance materials to the platform, opening up new material-editing possibilities for Omniverse and Substance 3D users.
The simulation solution lets designers and artists collaborate in real time from anywhere, in a shared virtual world, using common software applications. Experts at more than 500 companies, including SHoP Architects, South Park, and Lockheed Martin, are evaluating the platform. Since the open beta launched in December, Omniverse has been downloaded by more than 50,000 developers.
“Nvidia Omniverse connects worlds by making the vision of the metaverse a reality,” says Richard Kerris, Vice President of the Omniverse development platform at Nvidia. “With input from developers, partners, and customers, we are continuing to evolve this revolutionary platform so that everyone, from individuals to large enterprises, can work with others to build amazing virtual worlds that look, feel, and behave just like the physical world.”
The metaverse is an immersive, connected, shared virtual world. In it, artists can create unique digital scenes, architects can design buildings, and engineers can design new products for the home. These creations can then be transferred to the real world.
The simulation platform has a rapidly growing ecosystem of supporting partners. Key to the platform’s adoption across the industry is Pixar’s open-source Universal Scene Description (USD), the foundation of the collaboration and simulation platform. It allows large teams to work simultaneously on a shared 3D scene across multiple software applications. This open standard gives software partners several ways to extend and connect to Omniverse, whether by adopting and supporting USD, building a plug-in, or building a Connector.
Apple, Pixar, and Nvidia have teamed up to bring advanced physics capabilities to USD and to embrace open standards, making 3D workflows available across billions of devices. Blender and Nvidia will bring USD support to the upcoming Blender 3.0 release, and with it to the many artists who use the software. Nvidia is contributing USD and material support to a Blender 3.0 alpha USD build, which will be available to developers soon.
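As a side note, Blender’s USD support is already scriptable today: recent builds expose a USD exporter through the Python API. Here is a minimal, hedged sketch — it assumes a Blender build that ships `bpy.ops.wm.usd_export` (available since Blender 2.82), and the import guard just lets the snippet be loaded outside Blender without crashing:

```python
# Hedged sketch: export the current Blender scene to a USD file via the Python API.
# Assumption: a Blender build with bpy.ops.wm.usd_export (Blender 2.82+).
try:
    import bpy  # only available inside Blender's embedded Python
except ImportError:
    bpy = None  # running outside Blender


def export_scene_to_usd(filepath="scene.usdc"):
    """Export the current scene to USD; must be run from inside Blender."""
    if bpy is None:
        raise RuntimeError("This script must be run from inside Blender")
    bpy.ops.wm.usd_export(filepath=filepath)


if bpy is not None:
    export_scene_to_usd()
```

Run from Blender’s scripting workspace (or `blender --python export_usd.py`), this writes a `.usdc` file that USD-based pipelines such as Omniverse can pick up.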
Nvidia and Adobe are collaborating on a Substance 3D plug-in that will unlock new material-editing capabilities for Omniverse and Substance 3D users. Artists and creatives will be able to work directly with Substance materials, whether sourced from the Substance 3D Asset platform or created in Substance 3D applications, for a seamless 3D workflow.
The ecosystem continues to grow, connecting industry-leading applications from software companies such as Adobe, Autodesk, Bentley Systems, Blender, Clo Virtual Fashion, Epic Games, Esri, Golaem, Graphisoft, Lightmap, Maxon, McNeel & Associates, Onshape by PTC, Reallusion, Trimble, and wrnch Inc.
Lockheed Martin is using the simulation solution to simulate, predict, and fight wildfires. Lockheed Martin’s AI Center of Excellence, together with Nvidia, has launched an initiative called Cognitive Mission Management to develop a strategy for emergency response and firefighting.
“Using Nvidia Omniverse and AI to simulate wildfires will help us predict how quickly and in which directions they spread, and explore how environmental variables such as wind, moisture, and ground cover affect their behavior, so that firefighting teams can respond better and reduce their harmful effects,” says Shashi Bhushan, lead AI software and systems architect at Lockheed Martin. “Our goal is to develop models that coordinate sensor data and real-time analytics to build an AI that guides human intervention at various levels.”
New York-based SHoP Architects uses Omniverse for real-time collaboration and visualization. With Omniverse, the firm’s team can use the right software for the right task and bring all the resulting data together in one place, where it can be extended and deployed through the platform.
South Park, the long-running, Emmy-winning animated series, uses the simulation platform to let multiple artists collaborate on scenes and to optimize its famously tight production schedule.
The enterprise version is currently in early access. The platform will become available later this year on a subscription basis through the Nvidia partner network, including Asus, Boxx Technologies, Dell Technologies, HP, Lenovo, PNY, and Supermicro.
Pricing information is available here.
Source:
A few days ago, the popular open-source 3D modeling software Blender launched a new VR view tool that lets you see your 3D scene in VR! You could already use Blender in XR with the MARUI plugin, but here we’re talking about official support by the Blender Foundation, which ships with version 2.83.
The whole 3D artists’ community got hyped about this, because it is very cool. Do you want to try it yourself? Guess what, the Ghost has got you covered!
Here is my usual video tutorial, where I explain how to activate the VR view feature in Blender, which devices it works with, and how it performs in action. It’s a very easy process, so the video is quite short. Enjoy!
There is a little error in the video regarding the configuration of WMR headsets; refer to the guide below for Mixed Reality headsets.
Let’s dig into this new VR feature for Blender.
Blender VR works through the OpenXR standard, for which a runtime has so far been implemented only for Windows Mixed Reality headsets and Oculus headsets. So, if you have one of these headsets, you can go on; otherwise, you have to wait for Valve to adapt its SteamVR runtime to OpenXR.
Blender also advises you to have a PC running Windows.
To try VR visualization of your 3D scenes in Blender, you have to install Blender version 2.83 or above.
The little problem is that, at the time of writing, the official downloads page only shows version 2.82a… so where the heck can you find version 2.83? Easy peasy: on the page dedicated to the daily builds of the software, where you can already find the alpha version of Blender 2.83.
Download it and save it onto your PC, then unzip the package wherever you want Blender to be: this isn’t an installer, but a portable version of Blender, so mind where you unzip this folder.
Open the folder you have just unzipped, but don’t launch “Blender.exe” as usual: follow the instructions described at this page to launch Blender the correct way for use with VR. Let me copy them here below to simplify your life.
To check if a PC meets the requirements to run Windows Mixed Reality, Microsoft offers the Windows Mixed Reality PC Check application.
...
in the lower left corner. In the menu it opens, select Set up OpenXR. Windows Mixed Reality is now ready to be used with OpenXR. For more information, refer to Windows’ Getting Started Guide for Windows Mixed Reality and OpenXR.
After you have configured WMR this way, you can finally launch the Blender.exe executable.
Oculus only provides prototype OpenXR support. To use it, Blender has to be started in a special way: run the blender_oculus.cmd script inside the installation directory. It will open a command-line window with further information.
Once you have opened Blender, you have to activate the VR viewer add-on. Go to Edit -> Preferences… and in the dialog that pops up, select the “Add-ons” tab.
In the search input box (in the upper right corner), look for “VR”, then click the checkbox next to “3D View: VR Scene Inspection” to activate the add-on.
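If you prefer scripting, you can also enable the add-on from Blender’s Python console. This is a hedged sketch: it assumes that in Blender 2.83 the “VR Scene Inspection” add-on registers under the module name `viewport_vr_preview`, and the import guard lets the snippet be loaded outside Blender too:

```python
# Enable the "3D View: VR Scene Inspection" add-on programmatically.
# Assumption: in Blender 2.83 the add-on's module is named "viewport_vr_preview".
try:
    import bpy  # only available inside Blender's embedded Python
except ImportError:
    bpy = None  # running outside Blender

VR_ADDON_MODULE = "viewport_vr_preview"


def enable_vr_addon():
    """Enable the VR Scene Inspection add-on; must run inside Blender."""
    if bpy is None:
        raise RuntimeError("Run this from Blender's Python console")
    bpy.ops.preferences.addon_enable(module=VR_ADDON_MODULE)


if bpy is not None:
    enable_vr_addon()
```

This does exactly what the checkbox in the Preferences dialog does, which is handy if you set up many machines.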
Congratulations, Blender is now ready to show your content in VR!
Now, if you want to activate the VR view, click the little “<” at the right edge of the 3D viewport to show the lateral toolbar.
Select the VR tab, and you can see some options related to the configuration of your VR visualization. You can use this webpage as documentation explaining the meaning of all those options (even if some of them are self-explanatory).
One parameter is “Camera”, which lets you choose the camera in the scene through which you’ll have the VR view. By default, it selects the main camera of the scene, but if you want a special view for VR, you can have one: select Type “Custom Camera”, add a new camera to the scene, and then select the one through which you want to see your 3D scene in VR.
At this point, click “Start VR session” to see your scene in virtual reality! While you explore the environment in VR, you can also use your mouse on your desktop screen to move the camera in the scene, which moves your view in VR too!
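These last steps can also be driven from Python. The sketch below is hedged: it assumes Blender 2.83’s XR API (the `xr_session_settings` on the window manager and the `wm.xr_session_toggle` operator), and property names may differ in later versions. The camera name is an illustrative placeholder:

```python
# Hedged sketch (Blender 2.83 API assumed): start a VR session, optionally
# using a specific camera object as the base pose instead of the scene camera.
try:
    import bpy  # only available inside Blender's embedded Python
except ImportError:
    bpy = None  # running outside Blender


def start_vr_session(camera_name=None):
    """Toggle the VR session on; must be run from inside Blender."""
    if bpy is None:
        raise RuntimeError("Run this from inside Blender")
    settings = bpy.context.window_manager.xr_session_settings
    if camera_name is not None:
        # Mirror the "Custom Camera"-style setup from the VR tab:
        # base the viewer pose on a chosen object rather than the scene camera.
        settings.base_pose_type = 'OBJECT'
        settings.base_pose_object = bpy.data.objects[camera_name]
    bpy.ops.wm.xr_session_toggle()  # starts (or stops) the VR session
```

Calling `start_vr_session("MyVRCamera")` from the Python console is equivalent to picking that camera in the VR tab and clicking “Start VR session”.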
If you don’t know where to find some cool Blender scenes to watch in VR, you can find some demo assets directly on the Blender website.
I was really amazed by how easy it was to download Blender and configure it for VR. I’ve used it with Oculus Quest + Link, and I can tell you that it worked really well.
Source:
https://skarredghost-com.cdn.ampproject.org/c/s/skarredghost.com/2020/04/10/how-to-blender-vr/amp/
Blender, a popular free open-source modeling and animation tool, is soon to get VR support via the OpenXR API. Initially users will be able to step into their 3D scenes to see them up close and at scale, with new features expected in the future.
The next version of Blender, version 2.83, planned for release in late May, will include a first wave of VR support, the organization recently announced. VR support is being added via the OpenXR API, which will allow the software to interface with any headset supporting OpenXR (a standard with wide support in the VR industry, though it is still in the early stages of rolling out to individual headsets).
Initially, Blender’s VR support will only allow for scene inspection, which means users can look at their creations up close and at scale. For those using Blender to create assets and animations for use in VR games, being able to see their creations in-headset before being imported into a game engine could help streamline the production process.
More VR features are expected to be added in the future. Last year a Blender developer said “We have an awesome team of people interested in working on [XR].”
As Blender is a very complex piece of software, it’s unlikely that the full feature set (or even half of it) will be functional in VR; however, it’s conceivable that in the future users might also be able to move objects in their scene in VR, and perhaps even do basic modeling and things like rigging and ‘puppeteering’ for animation.
Well before Blender’s upcoming official VR support, third-party plugins like Blender XR by MARUI have already made it possible to view and interact with Blender scenes in VR. However, official support, especially via OpenXR, should help futureproof the feature by ensuring compatibility with future headsets.
Source:
Open-source Modeling & Animation Tool ‘Blender’ to Get VR Support via OpenXR
Maya, the paid tool from Autodesk, and Blender, a free and open-source alternative, are two of the most popular 3D creation tools in the industry. Maya is the tool used to create the 3D assets for countless films, TV shows, and video games. Blender is used for hobbyist projects, but also by large organisations such as NASA.
Because neither Maya nor Blender supports virtual reality, the Japanese startup MARUI-PlugIn developed plugins for each application to add VR support, called ‘MARUI’ and ‘BlenderXR’ respectively. MARUI is a paid plugin ($50/month or $550 lifetime) for Maya, whereas BlenderXR, just like Blender itself, is free and open source, with optional donations supporting it.
MARUI supports Oculus Rift, HTC Vive, and all Windows MR headsets, while BlenderXR supports all of these plus the eye-tracking PC VR headset ‘FOVE 0’.
For many creators, being able to see and manipulate assets at real scale directly with your hands and to look around it by simply moving your head is a paradigm shift from current monitor-based workflows. The folks behind the MARUI-Plugin claim that VR can reduce the cost of 3D production by up to 50%. The plugins aren’t focused on the full range of features for Maya or Blender. Instead, the company is focusing on design and animation. Using your hands to directly manipulate parts of the model that should move can be far more intuitive than the current approach of trying to move and rotate elements in 3D space with a mouse & keyboard.
At Oculus Connect 5, Oculus introduced a system for Rift called “Hybrid Apps”, which could be useful if the approach sees adoption by the likes of Blender or Maya in VR. That still hasn’t happened, so it looks like third-party plugins will be the go-to approach for now. For BlenderXR, the plug-in builders are embracing the community spirit of Blender by polling the community on which features should come next. For MARUI, development will follow the priorities of its paying customers. Recently they added voice recognition and direct 3D sketching for Maya, similar to Google’s Tilt Brush and Facebook’s Quill.
It is not yet known how widely the Maya and Blender userbase will embrace VR. Perhaps headsets aren’t high resolution enough yet, or perhaps switching between tasks which are better on a monitor and tasks which are better in VR is not yet seamless. While MARUI and BlenderXR look like the first real steps toward bringing these tools into spatial computing, we expect to see many more efforts in the coming years.
Source:
https://uploadvr.com/marui-plugins-bring-vr-support-to-3d-tools-maya-and-blender/