Welcome to Inside North Point Ministries Production! We have created this site to provide insight into our current projects and what we are currently learning as we strive to create an unforgettable experience for our guests each Sunday.

Our mission is to lead people into a growing relationship with Jesus Christ by creating a distraction-free technical experience each Sunday. The lights, stage crew, props, videos, and other elements are all part of that technical experience. It is a win for our team if none of these detract from our guests’ experience.

Multisite Transmission Explained

Our current multisite transmission system connects seven campuses around the greater Atlanta area. North Point Community Church serves as our central hub of communication and signal routing. Buckhead Church, Browns Bridge Church, Gwinnett Church Sugar Hill, and Woodstock City Church are all connected via a fiber network leased monthly from various telecom vendors.
The feeds to Decatur City Church and Gwinnett Church Hamilton Mill are delivered solely over the Internet, using equipment that aids in packet recovery and forward error correction. Each location is equipped with similar video gear, including cameras, switchers, projectors, and screens, which allows most venues to both originate and receive content in the same way. However, each auditorium is operated independently, with its own video control room and personnel to facilitate live local service elements, including music, announcements, and video playback. Additionally, each system is able to record and play back transmissions from any other location in a time-delayed, DVR-like fashion.

The Virtual Set Model

The message is the primary portion of our service transmitted between campuses. We utilize two different video paths to create a virtual teaching experience for each local audience. Along with a standard IMAG video cut, a static center camera shot captures the entire stage. When displayed together on multiple screens, the result is perceptually identical to the original environment. This virtual set is displayed the same way in every location.

Two screens (10.5′ x 18.8′), one on each side of the stage, display close-up camera shots and graphics. Along with the side screens, there is a single, large-format screen (16′ x 28′) mounted center stage that extends from the proscenium ceiling to the floor. The image projected here is a simple, static, centered wide shot of the stage that does not move or change composition during the service. It displays a comprehensive yet intentional view of what happens on the stage, composed so that the communicator appears in lifelike proportion: literally about 6′ tall as he walks back and forth. The combination of all three screens at a viewing campus creates a “virtual” copy of the original, live environment.
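The lifelike proportion comes down to simple scaling arithmetic: if the full camera frame fills a 16′ tall screen, the wide shot must frame roughly 16′ of vertical stage space for a 6′ communicator to appear 6′ tall on screen. A minimal sketch of that relationship (the function name and the 24′ comparison framing are illustrative, not our actual camera settings):

```python
def on_screen_height(subject_ft, framed_scene_ft, screen_ft):
    """Height a subject appears on screen, given how much vertical
    stage space the camera frames and how tall the screen is."""
    return screen_ft * (subject_ft / framed_scene_ft)

# A 16' tall center screen and a 6' communicator. Framing exactly 16'
# of vertical stage space yields a 1:1, lifesize image.
print(on_screen_height(6, 16, 16))   # 6.0 ft -- lifelike
print(on_screen_height(6, 24, 16))   # 4.0 ft -- shot too wide, person shrinks
```

Widening the shot beyond the screen's own height is what breaks the illusion, which is why the center camera's composition never changes during the service.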

The IMAG System

The IMAG video system at each campus consists of four to six cameras, graphics computers, and video playback servers that are all connected to a switcher. The collection of images generated by the switcher, often referred to as Image Magnification (IMAG), is then fed through a router to multiple in-house destinations, including side screen projectors, flat panel displays, and digital recorders. At receiving locations, the incoming IMAG feed is recorded on the first channel of a local time-slip server. It can then be triggered on demand to replay the content through the local switcher to the side screens.

The Center Cam System

The center camera system at each campus consists of a single camera mounted on a robotic pan/tilt system in the center of the auditorium. Parallel to, but separate from, the IMAG feed, this consistent wide shot is also fed through a router to multiple destinations, including digital recorders and other locations. The center camera signal is recorded locally on the second channel of the same time-slip server. It is then triggered, in sync with its IMAG counterpart, to play the center content on the separate center screen.


The systems at each campus were built on an SDI digital video platform, supporting eight channels of embedded audio. When the time came to connect the church locations, this existing infrastructure made the choice of transmission method quite clear. Through local telecom providers, we lease “dark” fiber paths between locations. Essentially, this means they provide the physical fiber optic path, and we are responsible for the gear that creates and manages the “light.”

The biggest advantage of fiber is its scalability. If we need to connect locations in new ways, we simply add equipment that utilizes a different wavelength of light on the same fiber path. Once the transmission hardware is configured, these fiber circuits act as uncompressed tie lines between each building.

The latency between locations, about 45 ms, is hardly noticeable, allowing the video signals to be displayed in a real-time, live presentation. However, because services at the various locations are not run in lockstep, live presentation is not our most common delivery method. In fact, the services are intentionally programmed a few minutes apart so that each campus can record and time-slip the content on demand.
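The offset scheduling can be sketched as simple time arithmetic: a receiving campus triggers playback of a segment once its own service reaches the same point, a few minutes behind the originating campus. All of the times and the function below are hypothetical examples, not our actual service schedule:

```python
from datetime import datetime, timedelta

def playback_start(origin_start, segment_offset, campus_delay):
    """When a receiving campus triggers time-slipped playback of a
    segment that began `segment_offset` into the originating service,
    given that its own service runs `campus_delay` behind."""
    return origin_start + segment_offset + campus_delay

# Hypothetical: originating service starts at 11:00, the message begins
# 20 minutes in, and the receiving campus runs 5 minutes behind.
origin = datetime(2024, 1, 7, 11, 0)
t = playback_start(origin, timedelta(minutes=20), timedelta(minutes=5))
print(t.strftime("%H:%M"))  # 11:25
```

Because the delay exceeds the 45 ms transport latency by orders of magnitude, the content is always fully buffered locally before it is needed.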

A couple of locations, along with our Strategic Partners, use a compressed video stream over the Internet. As previously mentioned, these systems employ error correction methods that allow a corrupted signal to be repaired on the receiving end. The latency between these locations is around 30 seconds, making real-time, live presentation impossible.
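The repair idea behind forward error correction can be illustrated with the simplest possible scheme: send one extra parity packet, the XOR of a group of packets, so any single lost packet in the group can be rebuilt at the receiver. Production systems use more elaborate row/column parity, so this is only a toy sketch of the principle:

```python
from functools import reduce

def xor_bytes(a, b):
    """Byte-wise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """One parity packet protecting the group: the XOR of all packets."""
    return reduce(xor_bytes, packets)

def recover(received, parity):
    """Rebuild the single missing packet (None) from the survivors:
    XORing the parity with every surviving packet cancels them out,
    leaving only the lost packet."""
    survivors = [p for p in received if p is not None]
    return reduce(xor_bytes, survivors, parity)

group = [b"pkt1", b"pkt2", b"pkt3", b"pkt4"]
parity = make_parity(group)
lossy = [group[0], None, group[2], group[3]]   # packet 2 lost in transit
print(recover(lossy, parity))                  # b'pkt2'
```

The cost is one extra packet of bandwidth per group plus buffering delay at the receiver, which is part of why these Internet paths carry roughly 30 seconds of latency rather than milliseconds.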

Time-Slip Recording and Playback

Time-slip recording is accomplished using a multi-channel video server. The device simultaneously records two or more video signals side by side and associates them with a single reference file. The dual-stream content can then be cued and played back in synchronization before the files even finish recording. The receiving church locations begin recording the content as it happens in real time, then simply play it when they are ready, much like watching a program on your DVR at home.
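The behavior above can be modeled as two channels recording against a shared frame index, with a single cue point driving synchronized playback of both. This class is a toy model of the concept, not vendor software:

```python
class TimeSlipServer:
    """Toy model of a dual-channel time-slip server: both channels
    record continuously, and playback can be cued to any recorded
    point while recording is still in progress."""

    def __init__(self, channels=2):
        self.channels = [[] for _ in range(channels)]
        self.play_head = None

    def record(self, frame_index, frames):
        # One frame arrives on every channel at the same timestamp,
        # keeping IMAG and center cam locked to a single reference.
        for channel, frame in zip(self.channels, frames):
            channel.append((frame_index, frame))

    def cue(self, frame_index):
        self.play_head = frame_index

    def play(self):
        # Emit the frame at the play head from every channel, in sync.
        frames = tuple(ch[self.play_head][1] for ch in self.channels)
        self.play_head += 1
        return frames

server = TimeSlipServer()
for i in range(5):                         # recording begins...
    server.record(i, (f"imag{i}", f"center{i}"))
server.cue(0)                              # ...cue playback from the top
print(server.play())                       # ('imag0', 'center0')
server.record(5, ("imag5", "center5"))     # recording continues during playback
print(server.play())                       # ('imag1', 'center1')
```

The key property, mirrored from the hardware, is that playback never waits for recording to finish: the cue point simply trails the record head by however far apart the services are scheduled.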