Integration of WPF, Unity3D (and Kinect)

I created a WPF project in Visual Studio; the XAML markup is driven by C# code. What I want to do is add a component to the user interface that shows a 3D scene. I would like this 3D scene to be controlled by Unity, because I need the Unity physics engine. The user should be able to interact with this three-dimensional scene using gestures recognized by the Kinect (for example, throwing a ball).

Is there a way to connect WPF, Unity3D and Kinect so that the user can manipulate the 3D scene this way? If so, can you point me to some examples or tutorials? If not, what is the best approach for letting the user manipulate a 3D scene with Kinect gestures?

2 answers

For embedding Unity3D in WPF, I would look at "How to get Unity3D View In wpf winform". The hard part is the Kinect interaction.
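As a rough illustration of the embedding approach from that question: a Windows standalone Unity build can be launched with the `-parentHWND` command-line switch so it renders into a window you own, and WPF can supply that window through an `HwndHost`. This is only a sketch under those assumptions; the executable name `UnityGame.exe` and the fixed size are illustrative, not from the original post.

```csharp
// Sketch: host a Unity standalone player inside a WPF control.
// Assumes a Windows Unity build named "UnityGame.exe" next to the
// WPF executable; Unity re-parents itself under the HWND we pass.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows.Interop;

public class UnityHost : HwndHost
{
    [DllImport("user32.dll")]
    private static extern IntPtr CreateWindowEx(int exStyle, string className,
        string windowName, int style, int x, int y, int width, int height,
        IntPtr parent, IntPtr menu, IntPtr instance, IntPtr param);

    [DllImport("user32.dll")]
    private static extern bool DestroyWindow(IntPtr hwnd);

    private const int WS_CHILD = 0x40000000;
    private const int WS_VISIBLE = 0x10000000;

    private Process unityProcess;

    protected override HandleRef BuildWindowCore(HandleRef parent)
    {
        // Create a plain child window for Unity to render into.
        IntPtr handle = CreateWindowEx(0, "static", null,
            WS_CHILD | WS_VISIBLE, 0, 0, 800, 600,
            parent.Handle, IntPtr.Zero, IntPtr.Zero, IntPtr.Zero);

        // Unity attaches its own window under the handle passed here.
        unityProcess = Process.Start("UnityGame.exe", $"-parentHWND {handle}");
        return new HandleRef(this, handle);
    }

    protected override void DestroyWindowCore(HandleRef hwnd)
    {
        if (unityProcess != null && !unityProcess.HasExited)
            unityProcess.Kill();
        DestroyWindow(hwnd.Handle);
    }
}
```

You would then place a `UnityHost` instance inside your XAML layout like any other control.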

You can interact with the Kinect through WPF and use ZigFu for Unity. The disadvantage of handling the Kinect in WPF is that you cannot get the data into Unity unless you use the new Kinect Client Server System: you send all your information to a web server and then retrieve it in Unity. This is a very bad idea; it probably won't be fast enough and will suffer serious lag.
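To make the data-transfer problem concrete, here is a minimal sketch of what shipping skeleton data out of the WPF process would look like, using a plain local UDP socket rather than a web server (a simpler variant of the same idea). The wire format, port number, and the `JointRelay` name are all illustrative, not part of any Kinect API; a Unity-side script would listen on the same port and apply the positions to scene objects.

```csharp
// Sketch: serialize one Kinect joint position and push it to another
// process (e.g. a Unity scene) over loopback UDP. Illustrative only.
using System;
using System.Globalization;
using System.Net;
using System.Net.Sockets;
using System.Text;

public static class JointRelay
{
    // Pack one joint as "name,x,y,z" -- a deliberately simple text format.
    public static byte[] Encode(string joint, float x, float y, float z) =>
        Encoding.UTF8.GetBytes(
            FormattableString.Invariant($"{joint},{x},{y},{z}"));

    // Unpack a datagram produced by Encode.
    public static (string joint, float x, float y, float z) Decode(byte[] payload)
    {
        var parts = Encoding.UTF8.GetString(payload).Split(',');
        return (parts[0],
            float.Parse(parts[1], CultureInfo.InvariantCulture),
            float.Parse(parts[2], CultureInfo.InvariantCulture),
            float.Parse(parts[3], CultureInfo.InvariantCulture));
    }

    // WPF side: fire one datagram per skeleton frame (fire-and-forget).
    public static void Send(UdpClient client, string joint,
        float x, float y, float z)
    {
        var data = Encode(joint, x, y, z);
        client.Send(data, data.Length,
            new IPEndPoint(IPAddress.Loopback, 11000));
    }
}
```

Even on loopback this adds per-frame serialization and scheduling latency, which is why the answer recommends keeping the Kinect handling and the scene in the same process.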

The alternative is to use ZigFu and handle the Kinect entirely inside Unity. That way the Kinect data never has to pass from WPF into Unity, so the problem above disappears.

In short, keep the Kinect handling and the 3D scene in the same place. I would do it all in Unity with ZigFu and embed the Unity view in WPF.


Another option is to embed the Unity Web Player in your application through its ActiveX/COM control (UnityWebPlayerAXLib) and drive the scene through that.


Source: https://habr.com/ru/post/1543439/
