Jetpack Compose supports embedding native Android Views (e.g. WebView).
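To make concrete what I mean by "embedding", here is a minimal sketch using Compose's `AndroidView` interop composable to host a `WebView` (the function name and parameters are my own, for illustration):

```kotlin
import android.webkit.WebView
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.viewinterop.AndroidView

@Composable
fun EmbeddedWebView(url: String, modifier: Modifier = Modifier) {
    AndroidView(
        modifier = modifier,
        factory = { context ->
            // Creates the classic Android View once; Compose then hosts it
            // inside the composition.
            WebView(context).apply { loadUrl(url) }
        },
        update = { webView ->
            // Called on recomposition; reload only if the URL state changed.
            if (webView.url != url) webView.loadUrl(url)
        }
    )
}
```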
In Flutter, there are several discussions of the techniques used to render and composite ("composite" as in image layer composition) such views, for example here, here, and here.
For a quick example of what I am looking for, here is an excerpt:
> `AndroidView` widgets need to be composited within the Flutter UI and interleaved between Flutter widgets. However, the entire Flutter UI is rendered to a single texture. In order to solve this problem, Flutter inflates and renders `AndroidView` widgets inside of `VirtualDisplay`s instead of attempting to add them to the Android View hierarchy alongside Flutter's texture directly. The `VirtualDisplay` renders its output to a raw graphical buffer (accessed through `getSurface()`), and not to any actual real display(s) of the device. This allows Flutter to graphically interleave the Android View inside of its own Flutter widget tree by taking the texture from the `VirtualDisplay` output and treating it as a texture associated with any other Flutter widget in its internal hierarchy. Then the `VirtualDisplay`'s Surface output is composited with the rest of the Flutter widget hierarchy and rendered as part of Flutter's larger texture output on Android.
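For reference, the technique the excerpt describes can be approximated with standard Android APIs. This is my own illustrative sketch (the helper name is hypothetical, and Flutter's real plumbing is considerably more involved), showing how a View can be rendered off-screen into a texture via a `VirtualDisplay` and a `Presentation`:

```kotlin
import android.app.Presentation
import android.content.Context
import android.graphics.SurfaceTexture
import android.hardware.display.DisplayManager
import android.view.Surface
import android.view.View

// Hypothetical helper: renders `view` into an off-screen SurfaceTexture
// instead of a physical display, roughly as the excerpt describes.
fun renderViewOffscreen(
    context: Context,
    view: View,
    width: Int,
    height: Int
): SurfaceTexture {
    val texture = SurfaceTexture(/* singleBufferMode = */ false).apply {
        setDefaultBufferSize(width, height)
    }
    val surface = Surface(texture)
    val displayManager = context.getSystemService(DisplayManager::class.java)
    val virtualDisplay = displayManager.createVirtualDisplay(
        "offscreen-view",                              // display name
        width, height,
        context.resources.displayMetrics.densityDpi,
        surface,                                       // frames land in this buffer,
        0                                              // not on a physical screen
    )
    // A Presentation is a dialog that draws its content onto a given Display;
    // attaching the View here makes it render into the Surface above.
    Presentation(context, virtualDisplay.display).apply {
        setContentView(view)
        show()
    }
    // The caller can sample this as a GL texture and composite it into its
    // own rendered output, which is the interleaving the excerpt describes.
    return texture
}
```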
I don't see any discussion of this topic in the Jetpack Compose documentation. How exactly does Jetpack Compose achieve the same effect? For instance:
- Does it first draw the native views to a separate Texture or Surface?
- etc.