Possibility to share audio along with screen on iOS

I noticed that the library does not provide a way to request audio when calling `getDisplayMedia`.

Furthermore, the iOS Broadcast Upload Extension implementation suggested by Jitsi does not handle `RPSampleBufferType.audioApp` sample buffers.
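For context, here is a minimal sketch of where those buffers arrive in a Broadcast Upload Extension. This assumes the standard `RPBroadcastSampleHandler` entry point; the actual forwarding mechanism to the host app (e.g. a local socket or App Group container, as in Jitsi's reference implementation) is left as a placeholder, since that is exactly the part that currently ignores app audio:

```swift
import ReplayKit
import CoreMedia

class SampleHandler: RPBroadcastSampleHandler {
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Handled by the reference implementation: the frame is
            // serialized and sent to the main application.
            break
        case .audioApp:
            // Audio from the shared app arrives here, but the reference
            // implementation drops it. Forwarding these buffers to the
            // main process would be the first step toward sharing it.
            break
        case .audioMic:
            // Microphone audio, if the user enabled it in the picker.
            break
        @unknown default:
            break
        }
    }
}
```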

I assume that even if one managed to send audio frames to the main application, they would not be processed by the react-native-webrtc library.

Is it correct that it is not possible to share other apps' audio during screen sharing? If so, what would be the potential options for implementing it, either within the library or as a separate solution?