Access Current Frame on iOS

I am looking to either render on top of the camera stream or replace it entirely with custom video. I’ve seen this might be possible on the web (like the blur effect), but I can’t find any information relating to iOS. Is this possible?

The SDK does not provide direct access to the video view. You’d need access to the raw video frames to do something like background blur, and those sit too many layers down the stack to reach from the public API.
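For context, here is a rough sketch of what per-frame processing would involve if you did have raw frames, e.g. a `CVPixelBuffer` from your own `AVCaptureSession` rather than from the SDK. This is purely illustrative of why raw-frame access is the prerequisite; it is not an SDK API.

```swift
import AVFoundation
import CoreImage

// Illustrative only: the SDK does not expose raw frames. This assumes
// you have a CVPixelBuffer from your own AVCaptureSession instead.
let ciContext = CIContext()

func blurredImage(from pixelBuffer: CVPixelBuffer) -> CGImage? {
    let input = CIImage(cvPixelBuffer: pixelBuffer)
    // Clamp to extent first so the blur doesn't fade to transparent
    // at the frame edges, then crop back to the original size.
    let blurred = input
        .clampedToExtent()
        .applyingGaussianBlur(sigma: 10)
        .cropped(to: input.extent)
    return ciContext.createCGImage(blurred, from: blurred.extent)
}
```

Even with a pipeline like this in hand, you would still need a way to feed the processed frames back into the call as the published track, which is exactly the hook the SDK doesn't expose.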