What do I need to know to get into videography as a photographer?

TijmenDal

Active member
So,

I've been saying this for about two years now, but I feel like I'm finally gonna have the time and money to do this shit. I want to get into video, as I find myself more and more thinking about video and getting creative with it and all that.

Now, of course, making good videos and taking good photographs are very similar in many ways, but also very different. By that I mainly mean the technological aspects of video.

I know some basic technical things about video, but not much.

A few things I know I need to school myself on (to give an indication of the stuff I do and don't know):

- Uncompressed HDMI out. I've always wondered what this means exactly, apart from the fact that it's good. Uncompressed sounds like RAW to me (I know it isn't).

- 4:2:2 and 4:4:4. I imagine this is about bpp and corresponds with Bayer patterns as we know them?

- I still have a question about RED's 12-bit RAW, as Landis pointed out in this thread https://www.newschoolers.com/ns/forums/readthread/thread_id/716343/page/2/

Is that 12 bits per pixel (bpp) or per channel (bpc)? Either way I'm confused...

What's the bit depth of most cameras?

- I remember reading stuff about Canon sensors shooting 1080p and having lines filled in (or something like that), because it's not native to the sensor, whereas the GH2 doesn't do that as much (I think it was Evan who said this, probably a year or more ago).

I'm sure there's other jargon, technical stuff and whatnot to learn; please point it out so I can dig into it. Don't really feel like studying for my midterm tomorrow...

The goal of this thread is mainly to collect stuff I need to research myself; if you want to elaborate that'd be great, but it's mostly just about mentioning topics I need to go over and understand.
 
Uncompressed HDMI: Nothing you should be concerned about unless you are using an external recorder. It basically means the video feed is sent straight out over HDMI without being passed through the camera's compression schemes, and can be recorded in a different format, thus capturing more information than you would with the camera's heavy internal compression. How much better this is depends entirely on the output, and it varies from camera to camera.

Bayer pattern: the typical color filter layout on most camera sensors. Bayer patterns usually consist of 4 photosites per pixel: 2 green, a red, and a blue. There are two green because green carries most of the luminance information.

http://en.wikipedia.org/wiki/Bayer_filter
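To visualize the layout described above, here's a toy Python sketch of an RGGB Bayer tiling (the exact arrangement varies by sensor; this is just for illustration):

```python
# Toy sketch of an RGGB Bayer mosaic (one common layout; others exist).
def bayer_pattern(rows, cols):
    """Return a rows x cols grid of color-filter labels (RGGB tiling)."""
    tile = [["R", "G"],
            ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

# Half the photosites are green, a quarter red, a quarter blue.
for row in bayer_pattern(4, 8):
    print(" ".join(row))
```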

Line skipping: As you know, DSLRs are capable of capturing a resolution much greater than 1920x1080. The "line skipping" comes in when they have to sample only 1920x1080 out of a full 22MP sensor. Recording just a 1920x1080 window on the sensor would result in a crop, so they have to sample every Xth line to get the desired resolution. I think it only applies to vertical lines.
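The basic sampling idea can be sketched in a few lines of Python (the 3456-row sensor height is a made-up number, and real cameras skip in hardware in ways that keep the Bayer color pattern aligned, so this is only the rough concept):

```python
# Hedged sketch of line skipping: pick 1080 evenly spaced row indices
# out of a taller sensor readout. Real cameras skip lines in pairs to
# preserve the Bayer pattern -- this only shows the basic idea.
def skipped_rows(sensor_rows, target_rows=1080):
    step = sensor_rows / target_rows
    return [int(i * step) for i in range(target_rows)]

rows = skipped_rows(3456)  # 3456 is a hypothetical sensor height
print(len(rows), rows[:4])
```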

This is all I can remember you asking, and it's all off the top of my head. Let me read your OP again haha
 
4:2:0, 4:2:2, 4:4:4 can be explained better than I ever could here:

http://en.wikipedia.org/wiki/Chroma_subsampling

basically, 4:2:0 compresses more by giving up more color information than 4:2:2, which in turn gives up more than 4:4:4 (which keeps all of it)
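To put rough numbers on "giving up color information", here's a back-of-the-envelope Python comparison of how much data each scheme keeps per uncompressed 8-bit 1080p frame (just arithmetic, not how any real codec actually stores it):

```python
# Rough storage math for one uncompressed 1920x1080 8-bit frame under
# the three common subsampling schemes. Luma (Y) stays full resolution;
# only the two chroma channels (Cb, Cr) get subsampled.
CHROMA_FRACTION = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}

def frame_bytes(width, height, scheme):
    luma = width * height                                  # full-res Y
    chroma = 2 * width * height * CHROMA_FRACTION[scheme]  # Cb + Cr
    return int(luma + chroma)

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(s, frame_bytes(1920, 1080, s), "bytes/frame")
```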

I actually don't know the whole bit depth/8-bit/10-bit/12-bit/14-bit RAW shebang myself, so that's one to hunt down threads for

and this is all mostly secondary to actually shooting video. As a photographer you'll pick up framing and composition very quickly. It's just the trick of maintaining it while adding movement to those 23.976 pictures a second ;)
 
Thanks Jamie.

I know how the Bayer pattern works; I was just wondering what 4:2:2, 4:2:1, 4:4:4 had to do with it, but I found out that's something entirely different, namely chroma subsampling, which is way too complicated. I just have to know that 4:2:1 sucks, 4:2:2 is OK and 4:4:4 is awesome.
 
Well, let's answer the questions you have now and move on.

Uncompressed HDMI will output the image before the camera's internal compression. Now, is this RAW? Not entirely, but it can be a 4:2:2/4:4:4 output, and with the right recorder you can capture that. Is it relevant for normal people? No.

4:2:2 and 4:4:4 are chroma subsampling schemes (often loosely called color spaces); they relate to how the image encodes the color you see. Read this link - http://www.dvxuser.com/articles/colorspace/

Not sure about RED and the bpp vs bpc thing, but it's the normal 12-bit video RAW, of which I think there's only one kind.

Most cameras you will buy under $20k will be 8-bit. I think maybe P2 is 10-bit? Not sure, but you can have high bit rates (200, 400, etc.) while still being 8-bit. People shit themselves over 10-bit vs 8-bit, but again, does it matter for the normal person? No.

This last question is about lines of resolution and how a camera creates the 720/1080 image you see. Canon DSLRs have 700-ish lines of resolution (never an exact number listed, at least on B&H), the GH2 has 900-ish lines, while the FS100 has about 1000. This relates to sharpness at full screen.

Now, the way that images are captured matters a lot, and I'm learning more about this every day. But I know Canon uses a form of line skipping to get to 720 and 1080, which they kind of have to do to make it work. I think this happens so that the image is compressed and not some HUGE raw file from the sensor. Every camera does this differently, and it also kind of depends on the effective MP of the sensor; the more there are, the harder it is to avoid moiré and aliasing. The GH2 and FS100 have higher resolutions, so at 1080p they tend to look sharper than a Canon 1080p image. That's why the GH2 looks so much better at full res: it's the lines of resolution that it records. It's awesome.

That all being said, it's insanely more complicated than what I just said, and I probably got something wrong in there, so others please correct me; I'm just going off the top of my head right now. Feel free to ask any more questions!
 
4:2:0 sucks, and I think 4:2:1 isn't possible, or it is but isn't used.

Barry Green posted about it somewhere on DVXuser but I have no idea where that post is now.

DVXuser is a great fucking site for technical video information if you want it. Lurk all the sections and you'll learn more than you ever need to.
 
Thanks a lot guys! Great help!

Especially the line skipping is super relevant. The other stuff is just for the pros, which I'm not.

The GH2 sounds great and looks great from everything I've seen, super stoked to pick one up once I get back home.

Bit depths still confuse me. It's probably different with video, that's the only explanation. 12 bpc would be 36 bpp, which is absolutely insane, and 12 bpp would be an incredibly low bit depth. Oh well. It's not like I'm shooting RAW (but I probably will be, for $500, in 6 years...).
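For what it's worth, the bpc/bpp confusion mostly comes down to whether you count before or after debayering. A quick sketch of the arithmetic, assuming a standard Bayer sensor that stores one sample per photosite:

```python
# Bayer RAW stores ONE sample per photosite, so 12-bit RAW is roughly
# 12 bits per pixel on disk (before compression). After debayering,
# each pixel gets three channels interpolated from its neighbors,
# i.e. 12 bpc = 36 bpp.
def raw_bpp(bpc):
    return bpc * 1              # one Bayer sample per photosite

def debayered_bpp(bpc, channels=3):
    return bpc * channels       # RGB after interpolation

print(raw_bpp(12), debayered_bpp(12))  # 12 36
```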

Like I said, this thread is more about topics to dig into than to explain stuff I don't know. If there's any other info you think is important to shooting good video, let me know!

Thanks
 
420 h264*

420 AVCHD isn't THAT bad. Yes, I miss 422 DVCPROHD, but for real, I don't mind 420. People have even tested 420 vs 422 on the FS100 and there is literally NO difference. People even say it's okay for keying. I dunno, some people are just so hard for 422.
 
I was just using what he said/correcting the 4:2:1, I know it's not that bad.
 
Yeah, bit rates are pretty straightforward: 24mbps, 28mbps, 50mbps, 100mbps, 220mbps, 440mbps and 880mbps are all popular bit rates. Is above 50 necessary? 99% of the time, no.

Canon h.264, AVCHD, et al. are usually 24mbps; the FS100 at 1080p60 is 28mbps. That's the standard for 420. My HVX shot 70-100mbps depending on the frame rate/frame size, and that was 422. RED's 12-bit RAW is something insane like 1GB/sec or something, it's fucked.

Bitrate is just the amount of information per second in a video stream. The more info, the more flexible it is to grade, the easier to key, etc. The main advantage of a higher bit rate for most people who want it is grading; the other chunk is keying. Again, for the normal person, it doesn't really matter.
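To make those bit rate numbers tangible, here's the quick megabits-to-megabytes storage math (ignoring audio and container overhead):

```python
# Bitrate in megabits/second -> storage per minute of footage.
def mb_per_minute(mbps):
    return mbps / 8 * 60  # 8 bits per byte, 60 seconds per minute

for rate in (24, 28, 50, 100):
    print(f"{rate} Mbps is about {mb_per_minute(rate):.0f} MB/min")
```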
 
Ok, that's great info. Thanks Evan.

You said 420 vs 422 doesn't really matter; is that why the 'downgrade' from the HVX to the FS100 is not really a downgrade, because that difference doesn't matter and the FS100 is a superior camera?

Also, I'm confused about formats and codecs.

You're talking about AVCHD. What is that exactly? I looked it up and Wikipedia said this:

For video compression, AVCHD uses the MPEG-4 AVC/H.264 standard

But in this case MP4 is the format and H.264 is the codec, right? What is AVCHD then? Sorry if this is a stupid question.
 
Color space is one piece of the camera, and definitely not the most significant. The FS100 has many awesome upgrades over the HVX: more resolution, 1080p60, various display options, a better and more sensitive sensor, it takes any lens, etc. Stuff like that.

Codecs can get confusing.

AVCHD is a compressed codec that comes in an .mts file. .mts works in Adobe, but I still encode my files to ProRes 422 LT, which is part of the Apple ProRes codec family. ProRes is an editing codec; some very high-end cameras and recorders will record straight to ProRes 422. LT is a lower-quality ProRes, but it's still a much higher bit rate than AVCHD. (Pumping too much data into a low-end codec can degrade the image, but LT is fine.)
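For the cross-platform crowd, one hypothetical way to do that AVCHD-to-ProRes LT conversion is with ffmpeg. This sketch just builds the command line rather than running it (the flags assume ffmpeg's `prores_ks` encoder, where profile 1 selects LT):

```python
# Build (not run) an ffmpeg command for .mts -> ProRes 422 LT .mov.
def prores_lt_cmd(src, dst):
    return ["ffmpeg", "-i", src,
            "-c:v", "prores_ks", "-profile:v", "1",  # profile 1 = LT
            "-c:a", "pcm_s16le",                     # uncompressed audio
            dst]

print(" ".join(prores_lt_cmd("clip.mts", "clip.mov")))
```

You'd pass the resulting list to `subprocess.run` if ffmpeg is installed.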

h.264 is another compressed codec, but it's normally used for exporting. Canon records to this terrible codec; I hate it. It's so compressed and so delicate. It's in .mov form so you can edit straight from camera, but I would always recommend encoding to an editing codec.

Other codec names like Canon 50mbps, DVCPROHD, RED RAW, etc. all just describe what the video file is, and depending on the codec, people encode. DVCPROHD came out of the camera as a weird file, but when logged and transferred (encoded) it became a DVCPROHD .mov, so it worked well to edit. Like most camera-native files (not RAW, just straight from camera), it still needed to be encoded to make it easier to work with.
 
The way I learned codecs is basically this:

Your sensor absorbs a bunch of light and spits out a whole spew of data to the processors in the camera. It's the codec's duty to take that information and compress it as much as possible whilst retaining as much of the image as possible.

AVCHD is simply a method of compression commonly used in video, although it is wrapped differently depending on the camera, and the files can have the extension .mts, .mp4, .mxf, etc. I can't tell you why, because that's about all I really understand about them myself haha.
 
I would worry more about learning codecs than any of the other stuff you asked about in the thread. The rest is pretty irrelevant as far as getting into filming goes, at least until you get into the REALLY technical aspects of it. I wouldn't even have been able to answer most of your questions in the OP. I mean, it's good to know, sure, but there's a lot of stuff you should be focusing on first (i.e. codecs).
 
I agree, but you really only need to know the basics of the overall codec spectrum that exists, and then learn the ones you'll be using most.

There's no need to memorize every codec out there, but it will be helpful for you to understand the ones you'll be using/exporting/converting to
 
Also, question about Premiere and FCP.

The little editing I've done was always in FCP 7. Now, more and more people seem to be switching to CS6, and I was wondering if it would be better to make the jump too. It's not that FCP 7 isn't good, because it's a great program, but considering that road has ended, it might be better to get familiar with Premiere, since it will have many more versions.

What do you guys think? I always used ProRes 422 (LT) for FCP; what's the best codec for Premiere?
 
You can edit practically any format in Premiere. Some people still convert to something like ProRes or DNxHD, mainly because I think those codecs don't lose quality as easily when colour grading, etc., and also require less computer power. I always just edit straight from my camera in its AVCHD format.
 
If Final Cut works for you, keep using it. I switched because I found CS6 to be faster, smoother, and easier. FCP 7 is almost 5 years old now, it's 32-bit, it's not going to be updated, bleh. Also, running Adobe means I can buy a badass editing desktop for under $1000.
 
That's exactly my point: it's not going to be updated. In 3 years CS 7/8/9 or whatever will be out, and by that time my computer will probably be kick-ass. I'm not concerned about FCP not working right now. It's more about getting familiar with Premiere and knowing it inside out, so that when the time comes to upgrade to CS 7/8/9 I at least feel very comfortable with it, instead of knowing FCP well but being totally lost in Premiere. Why not make the switch now? Or isn't that a legitimate question?
 
TBH, CS6 is very similar to FCP 7; the people who built FCP 7 (or at least someone involved) were part of the team for CS6. (Heard this in the rumor mill, so maybe not true.) But you can set the same FCP shortcuts, you can set up your layout however you want, it's awesome. It took me like 1-2 days of editing to get used to it.
 
Don't want to threadjack, but what format should I convert my .mov h.264 Canon files to for editing in Premiere CS6? Does Adobe have a program like FCP's Compressor (and a format like ProRes 422?) that I could use to do this? My workflow has been just editing straight from the camera, but if I could convert to another format to make it faster/use less CPU power, I'd do that.

 
By far the best program to convert video files is MPEG Streamclip.

You can download it for free for both mac and pc!
 
Thanks. That looks like pretty good freeware. What format would be best? (It has a lot of different ones: avi, mp4, mmv, mpeg.)
 
Is there a significant speed advantage of that over just h264 out the camera? I'm not sure if I should bother using it.
 
h.264 is a finalizing codec for playback. It isn't very malleable, which is why you need to convert it to an editing codec for editing. ProRes LT is the highest you should go with Canon DSLR footage, because any higher and you are packing the image in a container that is too big, so to speak. Going too high (ProRes 444) can actually cause banding and damage the footage.

Let's say you want to use some cookie cutters to make fun cookie shapes. Think of h.264 as a baked cookie. The camera produces baked cookies, but you want to be able to roll them out and manipulate their form, for which dough is better suited. Imagine transcoding as magically turning a cookie back into dough. ProRes is like dough, which is baked into a cookie (h.264) once you're finished.
 
...not to mention that rolling dough takes up more space on your counter than baked cookies do, which is why you create a folder of temporary ProRes files while still keeping the original h.264 files. Once you are done editing you can archive the project and delete the ProRes versions to save disk space.

Or you can be like sheHeath and keep transcoded versions of the clips on your hard drive permanently because you're too lazy to transcode on the off-chance that you need to pull up an old project five years later, and you enjoy pissing away money on hard drives.
 
Haha yeah, fair enough, I'll give it a try later. It's not an exact metaphor though, because you CAN edit with h264 but can't remould biscuits. I realised h264 was an exporting codec but didn't realise there was a huge difference, and I also thought ProRes was Mac-only.
 
I'd had a quick Google in the past and nobody seemed to have any evidence it was faster, or which one to use, so I kind of forgot about it.
 
If you edit h.264 natively, the computer has to do a live render while you work. So technically your machine is working harder than if you had set aside time beforehand to transcode. How significant that speed difference is, I don't know. I hate Premiere, so I use FCP.
 
From what I remember, the ProRes family of codecs doesn't come preinstalled on Macs; it's automatically installed with FCP 7, but there are other options too.
 
Well, I'm on a PC, so I don't have FCP (or its ProRes codec). Which format in MPEG Streamclip should I use?
 