Can someone explain how the server is going to know whether or not the client browser is actually showing the ad? A stealthy browser would say, "hey, yeah, send that ad so I can render it to the user," the server says, "yeah, ok," and then <doesntRenderAdOnClientDevice>. How is the server going to know whether the ad was displayed? Don't current-gen adblockers skip retrieving the asset in the first place? And if the asset was retrieved but not displayed, how (if at all) can that be monitored?
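(For context: as far as I know, the server can't directly know. Ad networks rely on client-side "viewability" beacons, roughly the IAB standard of 50% of the ad's pixels visible for at least one continuous second. Here's a minimal sketch of that idea using IntersectionObserver; the "#ad-slot" selector and "/viewability-beacon" endpoint are made up for illustration.)

```typescript
const AD_SELECTOR = "#ad-slot";           // hypothetical ad container
const BEACON_URL = "/viewability-beacon"; // hypothetical reporting endpoint

const adEl = document.querySelector(AD_SELECTOR);
if (adEl) {
  let timer: number | undefined;

  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        if (entry.intersectionRatio >= 0.5) {
          // >= 50% of the ad is in the viewport: start a 1s countdown.
          timer ??= window.setTimeout(() => {
            // Still visible after one continuous second: report "viewed"
            // and stop watching. Fire-and-forget via sendBeacon.
            navigator.sendBeacon(BEACON_URL, "ad-slot:viewed");
            observer.disconnect();
          }, 1000);
        } else {
          // Dropped below 50% before the second elapsed: reset the clock.
          window.clearTimeout(timer);
          timer = undefined;
        }
      }
    },
    { threshold: 0.5 } // fire when crossing 50% visibility either way
  );

  observer.observe(adEl);
}
```

The obvious flaw: all of this runs on the client, so a stealthy browser can simply fire the beacon without ever painting the ad, which is exactly the scenario above.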
<types in safari> But what about Apple?!?! Can't they strong-arm this?