Cross-platform 3D engine for embedding in an iOS / Android view?

For my new application I want a native iOS or Android UI, plus a 3D view that shows some graphics. My initial plan was to use CSS transforms in a WebView to achieve the desired 3D effect, but the performance was nowhere near acceptable, so I'm looking for another solution. Here is what I think my options are:

1. Recreate the scene in OpenGL ES
I would completely recreate our graphics engine in OpenGL. This is certainly possible, but it will probably be very difficult for someone like me who has never done any 3D programming before. Can OpenGL code written for iOS be ported to Android, or would I need to build the engine twice?

2. Use a ready-made 3D engine such as Unity3d
I came across Unity3d, Marmalade and similar tools that would likely make building 3D scenes easier. However, at first glance it seems that I would have to create the entire application in the corresponding authoring environment, whereas I want to use my own native controls for everything except the 3D view. Is that possible with these tools? If so, can the 3D part be ported between iOS and Android?

Did I ask the right questions? Did I make the right assumptions? Does anything sound wrong to you? What did I miss?

Have I covered all the options, or is there something else, perhaps some kind of hybrid solution?

Which option should I choose?

Edit: To clarify, I don't want to show iOS or Android controls inside the 3D view, but rather around it: the user interface will be native, and the 3D part will be contained within a single view.

1 answer

Pretty complicated :-)

(1) OpenGL vs. an engine:

I think it would be possible to write the core OpenGL code in a portable way, for example in C++, using Objective-C++ on iOS and the NDK on Android (just an idea, I have never actually used the NDK). From there it depends on what the graphics part has to do. When you say graphics, do you mean you only need plotting functionality to draw a graph on the screen? Or do you have richer objects to display in OpenGL, as in games or architectural applications?

If the latter is the case, the big problem is the seamless integration of models. Any scene containing more than cubes is meant to be built in a modeling tool and exported in one of several formats (obj, dae, fbx, ...), but then you need to import it into your application. That is where game engines come into play, be it Unity, SIO2, Bork, Unreal, Oolong, ...

If you only need to plot a graph, you might consider a hand-rolled OpenGL solution. Even then, some research into existing engines can be useful, because OpenGL is not very intuitive and it will take some time to get comfortable with it. I have drawn several graphs with zooming, coloring, etc. in an iPhone application using OpenGL directly, but that was a pure research project and portability was not a concern.

(2) Mixing OpenGL with GUI Elements

In principle, it is not possible to display a standard UIButton inside an OpenGL view, but you can have OpenGL views and regular UI views side by side and switch between them. I don't know of any tool that lets you do both OpenGL rendering and a regular GUI in a platform-independent way.

So on the one hand you would create your 3D content in the tool of your choice and then integrate your own native UI code. There are some resources on this for Unity and iOS, e.g. the questions "How to access Unity assets natively on Android or iPhone?" and "Can Unity-generated code be mixed with Objective-C on iOS?".

On the other hand, you would need a cross-platform framework for the regular GUI programming so you can run it on both Android and iOS. I am not familiar with this topic myself, but I have heard of Mono-based frameworks.


Source: https://habr.com/ru/post/1386059/

