I’ve seen iPads used for storyboarding, iPads used as slates, but I’ve never seen this before… and it’s awesome!
Eric Haase, an extremely talented Cinematographer, needed to capture some intimate interviews for a recent fund-raising project called Imagine 300. Since the project had no budget, he had to think outside of the box. I spoke with Eric to learn how he managed to turn a pair of iPads into a makeshift Interrotron! What’s an Interrotron, you ask? Read on, and prepare to want a second iPad.
This is the fund-raising video we’ll be talking about:
HHH: Start off by telling us a little about the project.
ERIC HAASE: My daughter attends a public elementary school in LAUSD and due to budget cuts in California and the district, there is no funding for a lot of programs that many parents feel are essential to a child’s education. Art, music, and P.E. all receive very little or no funding at our particular school.
Luckily, involved and dedicated parents serve on the PTA and raise money every year to pay for these programs. A group of us parents at the school feel we can do a better job communicating this situation to other parents at the school, in hopes they will donate money to the PTA to fund and improve the programs. To do this, we decided to create a fund-raising video.
HHH: What did you want to include in the video?
EH: I didn’t want the video to be information or statistics; I wanted to communicate on an emotional level how kids have been impacted by the programs. I wanted the viewer to see and feel for themselves how important the programs are to the kids, rather than be told. I thought one effective and relatively easy-to-produce way of doing this would be interviews with the kids and parents.
HHH: Makes sense. Walk us through the production.
EH: We had to keep things as simple as possible: a few of us who work in the industry were calling in favors to pull off the physical production, and we asked some parents at the school to help out on the shoot day. For these reasons we had to limit it to one day of shooting, and interviews seemed doable given the budget and time limitations.
I chose to shoot interviews against a black background. We shot the parent of the child first, and then separately shot the child. We also shot a bit of b-roll of the two of them together. I asked the parents and kids about their involvement in a particular program. I asked for specific stories and moments about how the child has benefited from the program: from making friends, to developing confidence, to discovering a talent or passion. I wanted real stories and honest emotion.
HHH: You achieved that emotional connection, in large part, because you had your interview subjects looking directly into the lens. Tell us about that.
EH: I’ve used Errol Morris’s interrotron style setup many times on commercials as a DP and really wanted the effect it can produce.
HHH: For any HHH readers out there who are not familiar with the interrotron, it’s absolutely brilliant. When shooting interviews, documentary director Errol Morris puts a teleprompter on his camera, but rather than projecting scrolling text on the prompter screen, he puts up a live video feed of himself. That way, his interview subjects can look directly into the lens but still be looking right at Errol. He named it the interrotron. (Check out the diagram to the right.)
EH: Exactly. I felt the project would benefit from the direct-to-camera eyeline you get with this method, along with the casual, comfortable, and very real conversational feel that can be achieved. You get very interesting and telling moments where someone will look off and avoid eye contact, or come back and make eye contact. I think it’s a huge difference over asking someone questions and asking them to look into the lens. When you don’t have a face where the lens is, the subject gets very self-aware and presentational. Substituting a face for the lens really relaxes the subject and lends a conversational feel to the interview. As a viewer, the interview just seems more real and honest. It’s really brilliant, and I wish I’d thought of the concept.
In a normal teleprompter/interrotron setup, you would have one camera shooting the subject with a teleprompter mounted in front of the lens, and another camera with a teleprompter shooting the director asking questions. The feed from the camera shooting the subject is sent to the director’s prompter, and the feed from the camera shooting the director is sent to the subject’s prompter. So the setup requires two mirrored prompters and two cameras.
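The cross-feed routing described here can be sketched in a few lines. This is purely a hypothetical illustration of the signal path, not anything from the actual production:

```python
# The classic two-camera interrotron cross-feed: each person's prompter
# shows the live feed from the *other* person's camera, so both parties
# make eye contact while looking into a lens.
feeds = {
    "subject_prompter": "director_camera",   # subject sees the director's face over the lens
    "director_prompter": "subject_camera",   # director sees the subject's face over the lens
}

for prompter, source in feeds.items():
    print(f"{source} -> {prompter}")
```

The key point the mapping makes visible: neither feed loops back to its own camera, which is exactly why the setup needs two cameras and two mirrored prompters.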
I knew we could not afford a real interrotron setup. So, I had to figure out a cheaper (free) way I could recreate the setup.
HHH: When you told me how you solved the problem, I was totally blown away… and annoyed that I didn’t think of it first. Tell us about it.
EH: I have a friend who owns a ProPrompter HDi Pro and he said I could borrow it. So I knew I had one mirrored prompter. I considered trying to build a second one myself and use a consumer camera as the camera that would shoot me asking questions. I also didn’t have a monitor that would fit the ProPrompter since it was made specifically to be used with the iPad. I needed a way to get a video signal to an iPad. That’s when it hit me. I could just use FaceTime with two iPads.
The only issue was that the camera shooting me asking questions would have to be an iPad, and it would not be behind a partially silvered mirror. If it were, the iPad would be too far away to capture my face close enough for the desired effect, since the iPad’s lens is a very wide-angle lens.
I tested out the idea by placing one iPad 2 in the ProPrompter and mounting that on my Canon 5D. This camera was focused on the subject, and the prompter iPad would display my face in a FaceTime chat session. A second iPad 2 shot me. I placed that iPad in front of a 17″ monitor (which was displaying the feed from the 5D), with the iPad’s camera roughly lined up between the eyes of the subject on the monitor. I could see about half of the subject’s face and could sort of look “through” the iPad’s camera at the subject on the monitor.
I tested this with a couple of test subjects and made sure it looked as though I was looking directly at them on their prompter. It was good enough that I could have a real conversation with the subject (even while only seeing half their face), and they could have a real conversation with me, seeing my whole face just like in a real interrotron setup.
HHH: How did your subjects react to the setup?
EH: I was a little worried about how the kids would react. I was really nervous they might get freaked out by sitting alone in a space under lights and having to look at a face on a screen. I found that the kids actually reacted very well to the setup. I gave them a little briefing on it and was amazed that most of them said, “Oh yeah, it’s just like Skype.” Almost all of them had video chatted with someone before, so they were very comfortable with it.
I also think not having the physical presence of an adult nearby when they were being interviewed really opened them up. We had everything blacked out all around, along with the cameramen so all the subject could see was my face on the prompter screen.
HHH: Aside from having to work out the placement of the second iPad, did you encounter any other technical problems?
EH: The main problem was the network. The location where we had to shoot did not have a Wi-Fi network, and FaceTime requires a Wi-Fi connection to initiate the chat session. I thought I could use my laptop or a router to create an ad hoc network for both iPads to join, but I tested this and found that FaceTime actually requires Internet connectivity to hold a chat session, even if both iPads are on the same local network.
I knew I needed an Internet connection fast enough to support a FaceTime session, so I borrowed someone’s Verizon 4G LTE MiFi hotspot. The small device can be AC powered (it had to have power to last a whole day) and creates a Wi-Fi network over Verizon’s 4G LTE mobile broadband. I tested it at the location to make sure the signal was strong and the bandwidth was fast enough to maintain a long FaceTime session.
I tested it with two iPads for a few minutes and it held up. The 4G speeds are actually amazing compared to 3G: I was getting 12 Mbps down pretty consistently, which is more than enough for FaceTime. I might have been able to get away with doing it all on 3G, but I didn’t want to take a chance. I was also concerned that the FaceTime session would get interrupted and break the flow of the interview.
HHH: So, all the FaceTime data for all the interviews was going over a single Verizon Hotspot? The data bill must have been enormous!
EH: I did some research on the Internet, and other people have successfully measured FaceTime usage at about 3 MB/minute. That’s only 180 MB/hour, which would mean about 1-2 GB for our shoot day. The device’s monthly allocation of included data was 5 GB.
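Those numbers are easy to sanity-check. A quick sketch, taking the 3 MB/minute figure Eric cites and the roughly 8-hour session he describes as given:

```python
# Back-of-the-envelope check of the FaceTime data budget from the interview.
# 3 MB/minute is the reported estimate; everything else is plain arithmetic.

MB_PER_MINUTE = 3      # reported FaceTime usage
SHOOT_HOURS = 8        # the session stayed open for over 8 hours
DATA_CAP_GB = 5        # monthly data allowance on the MiFi hotspot

per_hour_mb = MB_PER_MINUTE * 60                    # 180 MB/hour
shoot_total_gb = per_hour_mb * SHOOT_HOURS / 1000   # ~1.44 GB for the day
avg_bitrate_mbps = MB_PER_MINUTE * 8 / 60           # ~0.4 Mbps average

print(f"{per_hour_mb} MB/hour, ~{shoot_total_gb:.2f} GB total, "
      f"~{avg_bitrate_mbps:.1f} Mbps average")
```

The average bitrate works out to well under 1 Mbps, which is why the 12 Mbps hotspot had so much headroom, and why Eric suspected even 3G might have sufficed.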
I was really impressed on the shoot day with how the network and FaceTime worked. I had an open FaceTime session using the Verizon 4G LTE hotspot for over 8 hours. I don’t think it ever dropped one time. It was remarkable. We just left it on and connected, even between interview subjects.
I just finished cutting the video and am happy with the results of the makeshift iPad interrotron. The main drawback to this solution is the size of the interrotron screen the subject looks at: it’s rather small, since it’s limited to the size of the iPad. A normal interrotron setup would have a much larger screen, so the interviewer’s face would be much larger and more present for the subject to engage with. The other challenge is making sure the subject can hear you asking questions. I had a microphone that I spoke into and a speaker just underneath the prompter, and I strongly recommend that setup for anyone doing interrotron work.
Overall, I think the iPads with FaceTime provide a workable low-budget solution.
HHH: Thanks for sharing this terrific idea. I plan to steal it immediately.
EH: Go for it. Ideas and solutions like this are meant to be shared. I’d love to see how this idea could be improved or modified.
HHH: One last question. Do you use your iPad or iPhone in other ways while on the job?
EH: I use SunSeeker on my iPhone during location scouts and on shoots. I use my iPad almost exclusively over a laptop for all my prep work: reading scripts, making camera lists, emails, etc. I use the iPad camera connection kit to immediately download scout photos from my 5D or 7D to my iPad. I use Photogene to edit them and distribute them, lately using that app to post to Picasa for any other crew, agency, or clients to view or download. It’s great to be able to do this in the scout van or at a coffee shop during the scout. I’ve also presented and reviewed scout photos with the agency on the iPad during scouts. I use the iPad as a presentation tool in feature film interview meetings.
I use Quick Sale for invoicing on the iPad, Dropbox for keeping all docs like camera lists easily accessible, and Goodreader and Zen Viewer for file management on the iPad. Since I travel a lot for shoots I use Tripit on the iPhone and iPad to keep all my itineraries in one place. And of course, Yelp to find great places to eat and things to do in new cities.
HHH: Thanks for sharing your experiences with us, Eric!
To see more of Eric’s work, you can visit his website and follow him on Twitter (@ericjhaase). You can visit the Imagine 300 website to learn more about the great work they’re doing to provide kids with a well-rounded education.
This idea worked for me 22 years ago with a dialogue coach who worked so hard with her clients that she convinced herself the talent just couldn’t do the take without seeing her in the TelePrompTer. I used a small consumer camera to shoot the dialogue coach and ran the video directly into the TelePrompTer monitor. Using iPads and FaceTime is a really unique approach in today’s world.