00:46 | SashaC | joined the channel | |
00:46 | SashaC | Good afternoon everyone
| |
00:47 | Bertl | hey, how's it going?
| |
00:50 | SashaC | It's a sweltering hot day here in Sydney. Apart from that, I'm well. I've added some more text to the 'HDMI now streaming' google doc, answering Sebastian's question regarding the Zedboard's clock speed, etc. If you have the time, could you have a look?
| |
00:52 | SashaC | How are you Bertl?
| |
00:53 | Bertl | I'm fine, thanks, working on finalizing the hdmi lut mapping
| |
00:55 | Bertl | interesting, where do the calculations come from?
| |
00:55 | Bertl | ah, I see
| |
00:56 | Bertl | well, first, the Zedboard's clock rate is not limited to 166MHz or 225MHz
| |
00:56 | Bertl | the HDMI encoder used on the Zedboard (ADV7511) is limited to 225MHz
| |
00:57 | Bertl | and we currently limit the clock rate to 150 or 166MHz because that already gives 60Hz refresh at full HD
| |
00:57 | Bertl | so, in theory, we could go up to 225MHz given that we have a HDMI receiver which can handle this
| |
00:59 | Bertl | note that our HDMI frame is 2200x1126 pixel to create a 1920x1080 pixel image
| |
00:59 | Bertl | the rest is used for sync signals and the back and front porches
| |
01:00 | Bertl | and we transfer 16bit on each clock cycle, currently using 2:4:2 RGB, i.e. one red, two green and one blue per two pixels
| |
01:01 | Bertl | 2200*1126*60 = 148632000
| |
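Bertl's timing arithmetic can be checked with a short sketch; the figures are the ones quoted above, and the 2:4:2 packing he describes is noted only as a comment:

```python
# Pixel clock for the HDMI timing Bertl describes: the full frame,
# including sync and porches, is larger than the visible image.
H_TOTAL, V_TOTAL = 2200, 1126     # total frame (standard CEA-861 1080p60 uses 1125 lines)
H_ACTIVE, V_ACTIVE = 1920, 1080   # visible image
REFRESH_HZ = 60
BITS_PER_CLOCK = 16               # 2:4:2 RGB: one R, two G, one B per pixel pair

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
print(pixel_clock_hz)  # 148632000 -- under the ADV7511's 225 MHz limit
```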
01:08 | SashaC | ok, great! I'll add this to the article :)
| |
01:25 | aombk | left the channel | |
01:50 | gwelkind | left the channel | |
01:50 | jucar | left the channel | |
01:56 | gwelkind | joined the channel | |
02:04 | jucar | joined the channel | |
03:12 | Bertl | troy_s: ping?
| |
04:05 | troy_s | left the channel | |
04:05 | rexbron_ | joined the channel | |
04:06 | troy_s | joined the channel | |
04:06 | rexbron | left the channel | |
05:01 | SashaC | left the channel | |
07:12 | S3bastian | joined the channel | |
07:13 | S3bastian | good morning
| |
07:13 | Bertl | morning!
| |
07:24 | S3bastian | left the channel | |
07:30 | S3bastian | joined the channel | |
08:38 | philippej | joined the channel | |
08:58 | SashaC | joined the channel | |
09:27 | SashaC | left the channel | |
09:42 | S3bastian | left the channel | |
09:55 | S3bastian | joined the channel | |
11:11 | Bertl | off to bed now ... have a good one everyone!
| |
11:13 | SashaC | joined the channel | |
11:14 | ThatCantBe | joined the channel | |
11:23 | S3bastian | left the channel | |
12:29 | guest | Bertl: why 1126? It would match standard 1080p60 if you did it with 1125, e.g. 148.5 MHz (that's also a base SDI clock for 1.5G, which is actually 1.485G)
| |
12:51 | gwelkind | left the channel | |
12:52 | FergusL | left the channel | |
12:52 | FergusL | joined the channel | |
13:12 | philippej | left the channel | |
14:27 | gwelkind | joined the channel | |
14:47 | gwelkind | left the channel | |
15:04 | gwelkind | joined the channel | |
15:11 | jucar | left the channel | |
15:13 | troy_s | Bertl: Pong
| |
15:28 | jucar | joined the channel | |
16:11 | philippej | joined the channel | |
16:12 | philippej | left the channel | |
16:33 | philippej | joined the channel | |
16:40 | philippej | troy_s, thanks for the comments on workflow, and - I guess it's you - thanks for the explanation about typical production workflow
| |
16:40 | troy_s | philippej: I appreciate it when people actually do leg work and record things. It helps for others to pick up where things were left off often when an individual departs etc. It's a great idea.
| |
16:41 | troy_s | philippej: Yes. You can check revision histories in GDocs too. :)
| |
16:41 | troy_s | philippej: I like to give a breakdown when discussing things like this as it helps to frame / contextualize the otherwise worthless anecdotal opinions.
| |
16:41 | philippej | I saw your name, but there are also plenty of anonymous kittens and dogs and whatever :-)
| |
16:42 | philippej | I guess my role is to keep track of this, and phrase it somewhere between technical and production wording
| |
16:42 | troy_s | philippej: The greatest obstacle when discussing post production workflows is the context - and more importantly - the 'whys'. Until you slave away through a pipeline and see why things are done the 'historical' method, it's really hard to understand why.
| |
16:43 | troy_s | philippej: You seem to have a huge amount of knowledge beyond that. Useful stuffs to have such a mixed bag of folks.
| |
16:44 | troy_s | philippej: If we look to the _legion_ of "ZOMFG LET'S MAKE AN NLE!!!" things that have happened in the past, they have all failed miserably largely due to either A) underappreciating the complexities of each discrete component or B) "succeeded" and have miserably missed the needs of certain complex areas.
| |
16:44 | philippej | it was easier with film, since there was the proxy thing, edl, and as I understand it, a machine that would slice final footage together. Now the lines get blurred when you can edit r3d files directly
| |
16:44 | philippej | thanks :-)
| |
16:44 | troy_s | philippej: FCPX / Avid / Lightworks are all very capable _editors_, but some folks mistake the role of editor (NLE) in a more nuanced / complex pipeline.
| |
16:45 | troy_s | philippej: The machine was a skilled technician with white gloves actually :)
| |
16:45 | troy_s | philippej: Negative cutter in the credits.
| |
16:45 | philippej | (easier to understand I mean of course)
| |
16:45 | troy_s | philippej: But the _reasons_ _still_ hold true.
| |
16:46 | troy_s | philippej: That is, the reasons for not dealing with your archival footage (in the past it was "Don't destroy the negative!" but now it is "Make the data actually editable in realtime!"). The context shifted, but the process remains almost identical.
| |
16:46 | troy_s | It remains the only forward-looking process.
| |
16:46 | philippej | agreed
| |
16:47 | troy_s | (And a process that permits cross boundary interaction. For example, tracking a falling glass and rendering it in a scene _cannot_ work in a kitchen-sink styled application. At the very least, even if an app aspired to be an NLE / 3D renderer / Tracker / Compositor, it would likely suffer on at least one of those areas for integrity of data / qualitative issues.)
| |
16:48 | troy_s | I find that when discussing things like this, people often tend to horribly oversimplify the process...
| |
16:48 | troy_s | The idea of creating an editor that will spit out final frames is the path of pure ludicrous folly if you start trying to visualize a project.
| |
16:49 | troy_s | (Even titling can be quite a lovely experience to engage creatively, and scrolling white text across an image will have some shortcomings if it is attempted in a kitchen-sink app, let alone tracking and using shadows and 3D elements (ala Fincher styled credits for example))
| |
16:50 | troy_s | philippej: It all comes down to buying fully into the offline system with interchanges or rejecting it. And if one rejects it, they will have a helluva time accommodating the nuances of each unique need.
| |
16:51 | troy_s | philippej: Read the LAST paragraph of this article and you can see where knowledge cripples understanding http://fstoppers.com/black-magic-releases-raw-files-of-cinema-camera-for-you-to-color-grade
| |
16:51 | philippej | it all depends on the scale of production. i guess most people who don't get it are looking at a one man band short. Where a feature film involves so many different people with different jobs, tools and file formats
| |
16:52 | troy_s | (He's missing a key understanding of a post-production pipeline which makes it rather... challenging to understand how the pieces fit together. :) )
| |
16:52 | troy_s | philippej: Even one-person projects require it.
| |
16:52 | troy_s | philippej: I'd show you a few one-person projects that do that. And you _certainly_ require it on anything even remotely larger (like an _extremely_ small crew project)
| |
16:53 | troy_s | philippej: The exception being newsy styled things. Although even then, such as in the case of creative documentaries, often visual effects lend a huge bit of 'connective tissue' (think visual effect motif through a short piece) that cannot be ignored.
| |
16:54 | troy_s | philippej: The difference with one-person gigs is that the one person is performing _many_ roles. It doesn't change the context of the needs (which again, is a common refrain I hear)
| |
16:54 | philippej | agreed
| |
17:02 | troy_s | Anonymous grizzly?
| |
17:02 | troy_s | LOL
| |
17:03 | gwelkind | left the channel | |
17:37 | Bertl | morning everyone!
| |
17:39 | philippej | morning !
| |
17:39 | mars_ | morning Bertl
| |
17:41 | troy_s | Bertl: Pong
| |
17:42 | Bertl | hey troy_s, I was wondering, what would be the effect of switching color correction matrix and gamma lut?
| |
17:43 | troy_s | Bad
| |
17:43 | troy_s | :)
| |
17:43 | Bertl | i.e. first do the gamma lut (let's assume it is the same for each channel) and then the/a color correction matrix as you know it
| |
17:43 | troy_s | The issue is that the color can only be transformed on linearized data.
| |
17:43 | troy_s | Otherwise the luminance is off and you end up in strangeville.
| |
17:44 | troy_s | Color always must be transformed in the following order: 1) Invert any non-linearity to get to a linear ratio system 2) transform to linear XYZ 3) transform back to destination RGB 4) apply any non-linearity curves where needed (if EXR for example, don't bother)
| |
17:45 | troy_s | where 2 and 3 are concat'd into a single matrix (including any Bradford if needed in there)
| |
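troy_s's ordering (linearize, transform through XYZ with one concatenated matrix, re-apply the curve) can be sketched as below. The sRGB transfer function and the sRGB↔XYZ (D65) matrices are standard values; using sRGB for both legs is only an illustration, so the concatenated matrix here is ~identity, where in practice the second leg would be the destination space plus any Bradford adaptation:

```python
def srgb_to_linear(v):
    """Step 1: invert the sRGB non-linearity to get linear ratios."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Step 4: re-apply the destination non-linearity."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

SRGB_TO_XYZ = [[0.4124564, 0.3575761, 0.1804375],
               [0.2126729, 0.7151522, 0.0721750],
               [0.0193339, 0.1191920, 0.9503041]]
XYZ_TO_SRGB = [[ 3.2404542, -1.5371385, -0.4985314],
               [-0.9692660,  1.8760108,  0.0415560],
               [ 0.0556434, -0.2040259,  1.0572252]]

# Steps 2 and 3 concatenated into a single matrix, as troy_s suggests.
COMBINED = mat_mul(XYZ_TO_SRGB, SRGB_TO_XYZ)

def transform(rgb):
    lin = [srgb_to_linear(c) for c in rgb]   # 1) linearize
    lin = mat_vec(COMBINED, lin)             # 2+3) single concatenated matrix
    return [linear_to_srgb(c) for c in lin]  # 4) re-apply curve
```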
17:51 | philippej | crazy suggestion, is it possible to merge 1 & 2-3 (at the hardware processing layer)?
| |
17:53 | Bertl | please elaborate
| |
17:54 | philippej | step 1 : is it also done with a matrix ?
| |
17:55 | Bertl | I guess usually with a LUT, but in our case, we are already linear
| |
17:57 | philippej | I don't know if it's simpler, but doing all the processing at once might probably be easier for you? So I was suggesting to merge all the transforms to it in a single pass (if it can be done without side effects)
| |
17:58 | Bertl | yes, that is what I plan/planned to do with the 3x3 LUT adder
| |
17:59 | Bertl | but I'm not so sure it is the simplest way to get something working right now
| |
17:59 | philippej | then please discard my intervention :-) Let's not make your work harder
| |
18:23 | troy_s | Bertl: A LUT only speeds up the matrix, and then you have to also worry about interpolation.
| |
18:23 | troy_s | Bertl: Better, a matrix is reversible.
| |
18:23 | troy_s | You could of course combine both the matrix and the 1D transfer into a 3D LUT.
| |
18:24 | troy_s | Bertl: What's your issue?
| |
18:24 | troy_s | Bertl: Also note that I believe Gabe's work only applied the sRGB transfer curve, which obviously isn't the 709 transfer. You'd need to get the correct 709 transfer on the end.
| |
18:27 | Bertl | I'd like to use the CSC matrix available in the HDMI encoder for a quick and dirty color correction
| |
18:27 | Bertl | but there is no mechanism to do any kind of gamma correction after that
| |
18:28 | Bertl | the CSC is basically a 3x3 matrix plus 3 offsets, all completely adjustable
| |
18:29 | guest | and the "gamma" is derived from signal type?
| |
18:30 | guest | i think you would lose some amount of precision - CSC on 8bit source would be less than 8bit on the end (in terms of quantization)
| |
18:31 | Bertl | the CSC is 13bit signed plus 2 bit adjustment, and I can probably feed it with 12bit input
| |
18:32 | Bertl | so given that almost all HDMI recorders currently available do 10bit max, the precision should be fine
| |
18:33 | guest | i would worry about that assumed de-gamma on the input - csc does 13bit after it is linearized, right
| |
18:34 | guest | maybe you can force a signal standard there
| |
18:34 | Bertl | I would assume so, the documentation for the encoder is not completely clear to me
| |
18:36 | troy_s | Bertl: Sounds OK then?
| |
18:36 | troy_s | Bertl: The 3 offsets are singular though?
| |
18:36 | troy_s | Bertl: As in you can't get to a correct LUT likely for the transfer curve?
| |
18:37 | troy_s | (not sure what the term offset will do to the matrix other than ... well offset it )
| |
18:37 | troy_s | Bertl: Shouldn't it be easy to generate a limited 1D LUT for 709 and apply it on the tail end after the matrix?
| |
18:37 | Bertl | each channel in the CSC basically does X*A1 + Y*A2 + Z*A3 + A4
| |
18:38 | Bertl | where X, Y and Z are the input components (R/G/B, Y/Cr/Cb)
| |
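The per-channel operation Bertl describes is just a 3x3 matrix plus a per-channel offset, i.e. an affine transform. A minimal sketch, ignoring the ADV7511's fixed-point quantization:

```python
def csc_apply(matrix, offsets, pixel):
    """Each output channel is X*A1 + Y*A2 + Z*A3 + A4, per Bertl's description."""
    x, y, z = pixel
    return tuple(a1 * x + a2 * y + a3 * z + a4
                 for (a1, a2, a3), a4 in zip(matrix, offsets))

# An identity matrix with zero offsets passes the pixel through unchanged.
IDENTITY = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
print(csc_apply(IDENTITY, (0, 0, 0), (100, 150, 200)))  # (100, 150, 200)
```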
18:39 | guest | A4 is for correction of the broadcast spec data (limited range for analog/sync compatibility)
| |
18:39 | troy_s | guest: That should be a scale, not a shift.
| |
18:40 | guest | there should be also an X/Y/Z offset
| |
18:40 | guest | it is shift - you move the zero up to 16
| |
18:40 | troy_s | And the high? :)
| |
18:40 | guest | defined by scale in the matrix
| |
18:40 | troy_s | You realize that the luma /chroma 16-235/240 is a scale yes?
| |
18:40 | troy_s | So overscale it then offset?
| |
18:40 | guest | underscale and then offset
| |
18:41 | troy_s | But it can't do it non-uniformly, and it needs to be for the difference between the 709 Y' and Cb'Cr'.
| |
18:41 | guest | there should be some xyz offset before the matrix
| |
18:41 | guest | a matrix is never non-uniform :)
| |
18:42 | troy_s | Not sure how to do that with a simple scale and offset, but I'm sure it is possible (the 16-235 for example would scale to 16-271 then offset -16?)
| |
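The "underscale and then offset" guest describes is the standard full-to-limited-range mapping for 8-bit BT.709 video: luma occupies 16-235 and chroma 16-240, centered on 128. The two different scale factors (219 vs 224) are exactly the Y' vs Cb'Cr' difference troy_s is pointing at:

```python
def full_to_limited_8bit(y_norm, cb_norm, cr_norm):
    """Map normalized Y' in [0,1] and Cb'/Cr' in [-0.5,0.5] to 8-bit limited range."""
    y  = round(16 + 219 * y_norm)    # luma scaled into 16..235
    cb = round(128 + 224 * cb_norm)  # chroma scaled into 16..240, zero at 128
    cr = round(128 + 224 * cr_norm)
    return y, cb, cr

print(full_to_limited_8bit(0.0, -0.5, 0.5))  # (16, 16, 240)
print(full_to_limited_8bit(1.0, 0.0, 0.0))   # (235, 128, 128)
```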
18:43 | troy_s | guest: Matrices are all uniform I believe, as in linear.
| |
18:43 | troy_s | guest: Unless I am mistaking something. By non-uniform I mean that they are two-part or specific to the values.
| |
18:43 | troy_s | (In this case, the Y' has to be treated differently than the Cb' Cr'.)
| |
18:44 | guest | the de-gamma/gamma should be luted there on in/out, hardcoded to whatever formats the chip supports (601 / 709)
| |
18:44 | troy_s | And that is assuming we worry about broadcast YCbCr 709 and not simply dumping full range 709 to the HDMI display.
| |
18:44 | troy_s | Bertl: That brings up an interesting point - is the HDMI routing to a display that requires broadcast 709?
| |
18:45 | troy_s | Bertl: Because if it does, then the correct transform to YCbCr would be needed (which adds another bloody layer of complexity going from RGB to Y'Cb'Cr'.)
| |
18:45 | guest | that would mean you go by YCbCr ? (is there a 709 RGB?)
| |
18:45 | troy_s | guest: Of course there is.
| |
18:45 | guest | ok
| |
18:45 | troy_s | Or at least as far as I understand your question.
| |
18:46 | troy_s | I believe 709 by the IEC spec (have to look in my folder) only specifies the scaling etc.
| |
18:46 | troy_s | Not necessarily the model.
| |
18:46 | troy_s | (Either R'G'B' or Y'Cb'Cr')
| |
18:46 | troy_s | Good question though.
| |
18:58 | troy_s | guest: Looking now.
| |
18:58 | troy_s | guest: Hold tight
| |
19:00 | troy_s | guest: According to page 3, there's no specification on model and both R'G'B' (E'R, E'G, E'B) and Y'Cb'Cr' (E'Y, E'Cb, E'Cr) are outlined
| |
19:00 | troy_s | guest: So the only thing is the scaling as applied to the two different formats (both of which are discussed)
| |
19:00 | guest | the ' means linear?
| |
19:01 | troy_s | Curved
| |
19:01 | troy_s | Y'Cb'Cr' means the curve is baked in.
| |
19:01 | guest | the final product, understand
| |
19:01 | troy_s | (and the curve for a camera is different than viewing.)
| |
19:01 | troy_s | (Which counters my point above. Viewing suggests BT.1886 which glancing at the transfer looks like sRGB)
| |
19:02 | troy_s | (Assuming a correctly linearized 709)
| |
19:05 | troy_s | I lied
| |
19:05 | troy_s | 2.4
| |
19:06 | troy_s | Pure gamma.
| |
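The "pure gamma" troy_s lands on is the BT.1886 EOTF. Its full form has white- and black-level terms, but with a zero black level it collapses to a plain 2.4 power, which is the point being made:

```python
def bt1886_eotf(v, lw=1.0, lb=0.0):
    """BT.1886 EOTF: L = a * max(v + b, 0) ** 2.4, parameterized by
    screen white luminance lw and black luminance lb."""
    gamma = 2.4
    root = lambda x: x ** (1 / gamma)
    a = (root(lw) - root(lb)) ** gamma
    b = root(lb) / (root(lw) - root(lb))
    return a * max(v + b, 0.0) ** gamma

# With lb = 0 this is exactly v ** 2.4 -- "pure gamma".
print(bt1886_eotf(0.5))  # 0.5 ** 2.4, about 0.1895
```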
19:07 | SashaC | left the channel | |
19:07 | Bertl | okay, so 12bit 2:4:2 RGB (or YCrCb if you like) is working on the ADV7511 \o/
| |
19:08 | Bertl | we can use the CSC if we like to (e.g. to convert from RGB to YCrCb or the other way round)
| |
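Using the CSC for an RGB→Y'Cb'Cr' conversion, as Bertl suggests, amounts to loading it with the BT.709 coefficients. A sketch on normalized full-range values; the limited-range offsets would go into the CSC's A4 terms:

```python
# BT.709 luma coefficients
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB  # 0.7152

def rgb_to_ycbcr709(r, g, b):
    """Full-range R'G'B' in [0,1] -> Y' in [0,1], Cb'/Cr' in [-0.5, 0.5]."""
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

print(rgb_to_ycbcr709(1.0, 1.0, 1.0))  # white: Y' = 1, zero chroma
```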
19:08 | troy_s | Bertl: It's not Y'Cb'Cr' - it's RGB isn't it?
| |
19:08 | troy_s | (What is coming down the hose.)
| |
19:09 | troy_s | Bertl: It's just a transformed RGB from the linear correct?
| |
19:09 | troy_s | And if it is, the only real transform then is A) convert to 709 primaries (via the Gabe matrix as it is sRGB)
| |
19:09 | guest | depends where the hose is :)
| |
19:09 | Bertl | currently I'm outputting memory data, so it is whatever I write there
| |
19:09 | troy_s | then convert to 2.4 gamma (as per BT.1886)
| |
19:09 | Bertl | it can be interpreted as RGB or YCrCb from the encoder
| |
19:10 | Bertl | it can also be transformed via the CSC to change the color space
| |
19:11 | Bertl | so, assuming we have 12bit linear input, we probably could do the required transform in the CSC, as the gamma correction should be done in the receiver
| |
19:11 | Bertl | i.e. in the output device
| |
19:12 | troy_s | Bertl: How would it do that?
| |
19:12 | guest | can you bypass the gamma unit on the encoder's input?
| |
19:12 | troy_s | Bertl: In the output device?
| |
19:13 | guest | i have a feeling that the csc is usable for mixing bits (rgb to ycbcr) but is not a wise choice to use it for color modification - think of the fake brightness/contrast in monitors
| |
19:15 | Bertl | well, if it does YCrCb to RGB (and vice versa) it should do the job for color correcting our input as well, no?
| |
19:19 | Bertl | troy_s: from the metadata being broadcast via HDMI? i.e. interpreting the color space used for example?
| |
19:20 | Bertl | we could even send xvYCC data
| |
19:29 | troy_s | Bertl: Explain?
| |
19:29 | troy_s | Bertl: You'd still need to do all the same things to the RGB data from the sensor to get it into a suitable Y'Cb'Cr' transformed destination.
| |
19:35 | Bertl | and why wouldn't that work with the CSC?
| |
19:37 | troy_s | Bertl: No clue what the CSC is or how it works.
| |
19:38 | troy_s | Bertl: All I can say is that no matter what there's a matrix and a 1D LUT in there before.
| |
19:39 | Bertl | before what?
| |
19:39 | troy_s | Bertl: Before dumping to an output.
| |
19:42 | Bertl | but given that the HDMI encoder takes RGB input and can apply a 3x3 matrix (including offset) before sending out the data as RGB or YCrCb, why shouldn't we be able to use that (for now, as a hack)
| |
19:42 | troy_s | Bertl: I don't see why not. Only issue is the transfer curve.
| |
19:43 | Bertl | note: I'm not planning to do that in the future, there is no point as we won't have the HDMI encoder anyway
| |
19:44 | troy_s | Bertl: Quick and dirty is quick and dirty. Do what works, and is easiest at this point regarding time invested. My only thought is to make sure that whatever the level (422) of the output that it displays as closely as possible to the specification.
| |
19:50 | troy_s | Bertl: And better, using BT.1886 means that it is a pretty simple straight gamma. Although in the field it would likely need to be lower due to ambient light.
| |
20:20 | Bertl | btw, do we have a matrix for our sensor yet?
| |
20:20 | Bertl | i.e. actual kind-of-calibrated values?
| |
20:26 | philippej | since I wanted to know how Argyll works, a few weeks ago I tried to make a profile from your "perfect" png; if it is useful, it is here: http://public.philippejadin.be/it8-tungsten-01-12ms-perfect%20qm%20r.icm
| |
20:32 | Bertl | looks interesting, can you convert it to the values for the matrix?
| |
20:33 | Bertl | (interesting because the blue curve has a knee)
| |
20:34 | philippej | I'm not sure I know how all this works, so I'm probably not the best candidate. From here: http://www.color.org/opensource.xalter I see there is a tool to convert to XML
| |
20:38 | philippej | and this lib : http://sampleicc.sourceforge.net/
| |
20:44 | troy_s | Bertl: Gabe has some in his software to dump a DNG
| |
20:44 | troy_s | Bertl: Not sure what version they are.
| |
20:45 | philippej | would this help : http://www.mathworks.nl/help/images/ref/iccread.html ?
| |
20:45 | troy_s | Bertl: And judging from my glance at philippej's, it looks like a 3400K white balance
| |
20:46 | troy_s | Bertl: You can actually strip the matrix right out of an ICC.
| |
20:46 | philippej | the chart was tungsten indeed
| |
20:46 | philippej | it's orangish
| |
20:47 | troy_s | So there'd need to be a Bradford in there to get to D65
| |
20:48 | troy_s | Bertl: iccdump should do it.
| |
20:49 | philippej | Argyll might provide some more useful text files, but it doesn't solve the problem of white balance, if I understand correctly
| |
20:49 | troy_s | philippej: It does
| |
20:49 | troy_s | philippej: It simply created a straight-through profile. So if we know the white balance temp (we do), we can use a Bradford to get to D65
| |
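The Bradford step troy_s mentions adapts XYZ from the shoot's white point to D65 by scaling cone responses. The Bradford matrix, its inverse, and the white point values are standard published figures; using CIE illuminant A as the tungsten source here is an assumed illustration, since the actual shoot white point would come from the profile:

```python
BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]
BRADFORD_INV = [[ 0.9869929, -0.1470543,  0.1599627],
                [ 0.4323053,  0.5183603,  0.0492912],
                [-0.0085287,  0.0400428,  0.9684867]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def bradford_adapt(xyz, src_white, dst_white):
    """Scale Bradford cone responses by the destination/source white ratio."""
    s = mat_vec(BRADFORD, src_white)
    d = mat_vec(BRADFORD, dst_white)
    cones = mat_vec(BRADFORD, xyz)
    adapted = [c * dw / sw for c, sw, dw in zip(cones, s, d)]
    return mat_vec(BRADFORD_INV, adapted)

D65 = [0.95047, 1.0, 1.08883]
A   = [1.09850, 1.0, 0.35585]  # CIE illuminant A (tungsten), assumed source

print(bradford_adapt(A, A, D65))  # the source white lands on the D65 white point
```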
20:49 | troy_s | philippej: You or Bertl probably know the math for the matrix transform.
20:50 | philippej | *googles Bradford*
| |
20:50 | Bertl | iccdump gives me two identity matrices :)
| |
20:50 | troy_s | Bertl: Huh?
| |
20:50 | troy_s | Bertl: Then his profile isn't a matrix
| |
20:50 | troy_s | Bertl: Probably a shaper or something
| |
20:51 | philippej | my original disclaimer : I just wanted to test the tool
| |
20:51 | Bertl | http://pastebin.com/jhAxJN7G
| |
20:51 | philippej | I had to change some parameters to have a lower error level
| |
20:51 | Bertl | so it seems to use LUTs
| |
20:52 | philippej | I think you want the first one, a2b and not b2a
| |
20:52 | Bertl | no, I actually want a matrix :)
| |
20:52 | philippej | (wild guess again)
| |
20:53 | troy_s | Bertl: Grr. Let me find Gabe's software
| |
20:56 | philippej | Bertl, and this one : http://public.philippejadin.be/it8-tungsten-01-12ms-perfect%20bis.icm ?
| |
20:56 | troy_s | Bertl: https://github.com/apertus-open-source-cinema/alpha-software/tree/master/misc_scripts_and_utilities/dng
| |
20:57 | troy_s | Bertl: https://github.com/apertus-open-source-cinema/alpha-software/blob/master/misc_scripts_and_utilities/dng/src/utils/CMV12000_sRGB.cpp
| |
20:57 | troy_s | Not sure what revision those damn things are
| |
20:57 | troy_s | Bertl: I apologize for not working on them more. I've been working on this though...
| |
21:00 | troy_s | http://www.pasteall.org/pic/show.php?id=66765
| |
21:03 | troy_s | Bertl: Can you post an image with that transform matrix of Gabe's please?
| |
21:03 | troy_s | float camToLinSRGB[3][3] = {{1.64599803,-0.24326936, 0.26884016},
| |
21:03 | troy_s | {0.01493516, 1.87764304,-0.61132345},
| |
21:03 | troy_s | {0.13303648, -0.14394397, 2.21110193}};
| |
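Gabe's matrix above can be applied to linear camera RGB as a plain matrix-vector product. As troy_s notes below, whether the result is genuinely sRGB-primaried or "sRGB at the white balance of the photo" depends on whether a Bradford step was baked in:

```python
# Matrix as posted above (from CMV12000_sRGB.cpp)
cam_to_lin_srgb = [[1.64599803, -0.24326936,  0.26884016],
                   [0.01493516,  1.87764304, -0.61132345],
                   [0.13303648, -0.14394397,  2.21110193]]

def apply_matrix(m, rgb):
    return [sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3)]

# A pure-red camera sample picks out the first column of the matrix:
print(apply_matrix(cam_to_lin_srgb, [1.0, 0.0, 0.0]))
```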
21:03 | philippej | gotta go, good luck !
| |
21:03 | troy_s | (I'm not sure that he Bradforded that to get D65, so I'm not sure it is sRGB or "sRGB at the white balance of the photo")
| |
21:04 | philippej | left the channel | |
21:05 | Bertl | okay, np
| |
21:05 | troy_s | Ok off for some shopping. Shall return soon enough. Hit me up with a link if you get it Bert.
| |
22:02 | troy_s | Gosh darnit.
| |
22:02 | troy_s | Bertl: Why on earth did we get all those overexposed images with NO DAMN CHART
| |
22:02 | troy_s | Bertl: How can I profile with a decently exposed image with no chart??
| |
22:02 | troy_s | ERF
| |
22:05 | Bertl | in my raw folder, every image with _c_ has a chart
| |
22:06 | Bertl | _c or _c1, _c2
| |
22:06 | troy_s | I'm in it.
| |
22:06 | troy_s | The ONLY one with a recent datestamp
| |
22:06 | Bertl | the identically named images without _c have same settings, just no chart
| |
22:07 | Bertl | IIRC, sebastian has done a number of images with chart as well
| |
22:11 | troy_s | left the channel | |
22:11 | rexbron_ | left the channel |