Portals with Asymmetric Projection
Portals are often used in immersive experiences: linking together environments, acting as gateways to different worlds, or filters on a current one.
When both the outer world and the inner portal world share a rendering context, the two worlds can be rendered on top of each other, with the portal world masked outside of the visible gateway via the stencil buffer or other techniques. Sometimes the portal world is rendered to a flat quad first before being viewed from the outer world.
Sometimes the outer world is the real world in a head-tracked environment (e.g. head tracking with a Wii Remote, or with an iPhone X), or the two contexts are completely separate. In these cases, the camera rendering the portal content needs a projection matrix that accounts for its position relative to the viewer.
Render cameras can be defined by near and far planes, plus how far the edges of the view are from the focal point along the near plane. Most commonly, projections are symmetric, meaning the left and right edges of the view are equidistant from the focal point in opposite directions, and likewise for the top and bottom. In asymmetric projections, the horizontal or vertical extents (or both!) are unbalanced, resulting in the focal point being off-center.
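As a concrete illustration, here is a minimal, dependency-free sketch (the function name and values are made up for illustration): the focal point's horizontal offset is simply the midpoint of the left and right extents.

```javascript
// Focal point offset for a frustum's horizontal extents, measured on
// the near plane. A symmetric frustum (left === -right) centers on 0;
// an asymmetric one shifts the focal point off-center.
function focalOffsetX(left, right) {
  return (left + right) / 2;
}

console.log(focalOffsetX(-0.5, 0.5)); // symmetric: centered at 0
console.log(focalOffsetX(-0.2, 0.8)); // asymmetric: off-center (≈ 0.3)
```

The same midpoint idea applies vertically to the top and bottom extents.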
Creating the Portal Projection
Robert Kooima's generalized perspective projection is the definitive treatment of this problem, explaining how to compose transformation matrices for a system of 12x5 LCD screens that render relative to a viewer's tracked head position. This can be simplified a bit by abstracting away some steps inside a 3D engine (e.g. the view matrix), and by exploiting the requirement that the portal camera must be oriented to face the portal plane.
Both the outer world and portal world need to use a shared coordinate system, and the portal camera's pose can be calculated by taking the following steps:
- Set portal camera position to the viewer camera position.
- Orient the portal camera to face perpendicular to the XY plane that the portal lies on.
- Align the portal camera's extents to the portal gateway extents.
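The steps above can be sketched as a small, dependency-free function (`posePortalCamera` is a hypothetical helper; in three.js these would be `position.copy()` and `quaternion.copy()` calls):

```javascript
// Compute the portal camera's pose from the viewer camera and the
// portal, per the three steps above. Positions are plain {x, y, z}
// objects and orientations are plain {x, y, z, w} quaternions.
function posePortalCamera(viewerCamera, portal) {
  return {
    // 1. Follow the viewer camera's position.
    position: { ...viewerCamera.position },
    // 2 & 3. Adopt the portal's orientation so the camera faces
    // perpendicular to the portal plane; the extents are aligned
    // later when building the projection matrix.
    quaternion: { ...portal.quaternion },
  };
}
```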
Notice that in Figure 1, the portal camera (middle panel) follows the viewer camera (left panel) in an arc, but is always oriented towards the closest point on the XY plane where the portal lives.
Following the viewer camera is trivial, but finding the closest coplanar point to the portal is a bit more work, elaborated further in Kooima's generalized perspective projection. Luckily, since a camera's forward vector is -Z and the portal's forward is +Z, applying the portal's rotation to the portal camera results in the portal camera facing perpendicular to the portal's plane.

This trick only works when the camera is in front of the portal (e.g. portal.position.z < 0 in camera space). Otherwise, flipping some signs or falling back to Kooima's paper is necessary, but this should work for most cases.
The following code examples use three.js, and the portal is represented as an object with a position and orientation, with its scale representing its dimensions.
Now the portal camera's extents need to extend to include the portal. Extents are measured from the focal point, where the camera's direction and the portal plane intersect. Note that the extents do not necessarily have opposite signs: viewing the portal from the right, for example, can result in both the right and left extents being negative (see the middle panel in Figure 1).
Rather than storing 3D vectors for the edges of portals, storing a portal as a 4x4 matrix (or position, rotation, and scale in these examples) provides a few more shortcuts. Calculating extents in camera space is simplified because the portal camera is oriented perpendicular to the portal plane: the portal's forward in camera space is (0, 0, 1), and the portal lies parallel to the camera's XY plane. In camera space, the extents can be calculated from the portal's position and dimensions:
```js
const portalHalfWidth = portal.scale.x / 2;
const portalHalfHeight = portal.scale.y / 2;

// Transform the portal's position into the portal camera's space.
const portalPosition = new Vector3().copy(portal.position);
portalCamera.updateMatrixWorld();
portalCamera.worldToLocal(portalPosition);

let left = portalPosition.x - portalHalfWidth;
let right = portalPosition.x + portalHalfWidth;
let top = portalPosition.y + portalHalfHeight;
let bottom = portalPosition.y - portalHalfHeight;
```
The above extents are calculated from the focal point to the portal edges on the portal's plane, Math.abs(portalPosition.z) units away from the camera. Frustum extents are specified at the near plane, however, so the extents must be scaled from the portal plane down to the portal camera's near plane.
As the camera is looking down the -Z axis, the absolute value of portalPosition's Z gives the distance between the camera and the portal plane, which is the distance at which the extents were calculated. Choose a near plane for the perspective projection and scale the extents accordingly:
```js
const near = 0.01;
// Distance from the camera to the portal plane (camera looks down -Z).
const distance = Math.abs(portalPosition.z);
const scale = near / distance;
left *= scale;
right *= scale;
top *= scale;
bottom *= scale;
```
With all of the extents scaled to the near plane, it's possible to construct a projection matrix:
```js
portalCamera.projectionMatrix.makePerspective(left, right, top, bottom, near, far);
```
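The whole extents computation can be condensed into one dependency-free function (a sketch; `portalExtents` is a made-up name, and its inputs are assumed to already be in camera space):

```javascript
// Off-axis frustum extents for a portal centered at (px, py, pz) in
// camera space, with width w and height h, scaled down to a chosen
// near plane. Mirrors the snippets above.
function portalExtents(px, py, pz, w, h, near) {
  const distance = Math.abs(pz);  // camera looks down -Z
  const scale = near / distance;  // portal plane -> near plane
  return {
    left: (px - w / 2) * scale,
    right: (px + w / 2) * scale,
    top: (py + h / 2) * scale,
    bottom: (py - h / 2) * scale,
  };
}
```

Note that for an off-center portal (say px = 3 with a width of 2), left and right both come out positive, matching the earlier observation that extents may share a sign.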
With the correct projection matrix set up, the portal scene can now be rendered with the portal camera to a texture, and that texture displayed elsewhere.
This technique can be used as a simplification of Kooima's work in modern 3D engines: setting up a camera to render a texture that will be viewed from a different angle. These ideas and the following considerations are valuable tools when creating immersive, spatial experiences across mixed platforms, ultimately using textures as an interface.
Alternatives & Considerations
near == distance
In the above solution, the extents are calculated at the closest point that is coplanar with the portal, d units from the camera, and then scaled to a near plane n units from the camera. It's also possible to place the near plane d units from the camera and skip scaling the extents. This has the effect of clipping anything in the portal world that attempts to render "outside" of the portal.
Portals vs Windows
The examples shown use a static portal, although the technique also works if the portal moves. If the portal's coordinate system is static and does not move with the portal, the portal appears more like a window shifting over an overlapping world, like a filter. Moving the coordinate system with the portal gives more of a portal effect, such that the gateway always faces the same direction, coincidentally like the game Portal.
For rendering a portal in VR, the same technique can be applied to both viewer cameras, one for each eye. Note that the webview-in-Unity videos only render a single webview, resulting in inaccurate projections for true stereoscopic experiences (but it looks great as a video!).
Resources & References
- Robert Kooima: Generalized Perspective Projection
- Paul Bourke: Stereographics Papers
- Peder Norrby: Illusion of depth by 3D head tracking on iPhone X
- Johnny Lee: Head Tracking for Desktop VR Displays using the Wii Remote
- Unity Wikibooks: Projection for Virtual Reality