
#apertus IRC Channel Logs

2015/12/06

Timezone: UTC


23:06
arpu
left the channel
00:17
pozitron
joined the channel
02:36
intracube
troy_s: what did you mean "the linear portion of the sensor doesn't carry through the full range of data"
02:37
intracube
that some of the luminance values coming off the sensor will not be linear?
02:42
troy_s
intracube: Yes.
02:43
troy_s
intracube: Especially those near the saturated end (the full well region) as well as those near the bottom (near the noise floor)
02:43
troy_s
intracube: So as you can imagine, if the response is nonlinear and you are using a standard matrix to arrive at the correct colourspace values, the two nonlinear regions will skew. This is largely the reason you see such nasty ass highlights.
02:45
troy_s
(In a majority of cameras. For example, if you have a DSLR kicking around, pop a photo of skin near a hot lamp. The image will often look almost posterized because the internal camera matrix is not holding up near the nonlinear roof. Worse, the moment that a colour axis deviates from the learned red / blue, our perception will flag it immediately.)
02:46
troy_s
(Hence why tungsten / daylight balanced axes are very easy to accept even when there is deviance from the achromatic adjusted white. However, the very same amount of perceptual deviation along a greenish / magenta will jump out like a frog from a cake.)
02:46
troy_s
intracube: Senese?
02:46
troy_s
Sense even?
02:47
intracube
troy_s: sorry, I wasn't following channel
02:48
intracube
my notification seems to be broken, eh, oh well
02:48
intracube
troy_s: yep, got it
02:48
intracube
troy_s: "almost posterized because the internal camera matrix is not holding up near the nonlinear roof"
02:49
intracube
hmm, I was under the impression you'd get that effect even with a theoretically perfect sensor (on an object that isn't white)
02:49
intracube
as the three channels would max out and clip at different levels
02:49
intracube
effect is often visible on sunrises, or as you say, people's faces
02:50
intracube
but if you have a 'grey' object that happens to match the sensor white balance (if that makes sense)
02:50
intracube
all three channels would clip to white at the same point (in a theoretically perfect sensor)
02:50
troy_s
intracube: Technically that's sort of the magic of the way that the view transform should work.
02:51
troy_s
intracube: You've seen some handling for tonemapping and some of the implementations use a 'preserve saturation' which is actually somewhat counter to the aesthetic learned through film.
02:51
intracube
doing anything else (massage the data to roll off more gracefully) actually shifts away from accuracy to make it look more pleasing
02:51
troy_s
intracube: If you want to see whack, simply try blender.
02:51
intracube
or not?
02:51
troy_s
intracube: You actually sort of want a 3D LUT that converges based upon some threshold.
02:51
intracube
troy_s: ah yes, for tone-mapping and such, yep
02:52
troy_s
intracube: So say red gets above 95% or whatever you want, you actually want the other channels to come up to desaturate toward the neutral axis.
02:52
troy_s
intracube: That's the filmic response.
02:52
troy_s
intracube: Because the magic of film was that light nuked the three layers and you actually desaturated on your way to white.
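The "converge to white" behaviour troy_s describes can be sketched roughly like this (the function name, threshold, and linear blend are illustrative assumptions; real filmic transforms use smooth per-channel curves or a 3D LUT, not a naive blend):

```python
import numpy as np

def desaturate_highlights(rgb, threshold=0.95):
    """Pull the other channels up toward the neutral axis once the
    max channel crosses a threshold, so saturated highlights roll
    off toward white instead of clipping as a saturated primary."""
    rgb = np.asarray(rgb, dtype=float)
    peak = rgb.max()
    if peak <= threshold:
        return rgb
    # t is 0 at the threshold, 1 when the max channel hits full scale
    t = min((peak - threshold) / (1.0 - threshold), 1.0)
    neutral = np.full(3, peak)  # achromatic value at the same intensity
    return (1.0 - t) * rgb + t * neutral
```

A hard-clip LUT, by contrast, would leave `[1.0, 0.2, 0.1]` as a fully saturated red.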
02:52
intracube
troy_s: yep. in my mind that's something to sort out in post as part of the grading process
02:52
troy_s
intracube: If you think of what happens in Blender (or any default agnostic hard clip LUT)
02:52
troy_s
intracube: Saturated primary will be... saturated primary!
02:53
intracube
(and avoid overexposing when shooting)
02:53
troy_s
intracube: When in reality you want that fully saturated max intensity channel to converge to white.
02:53
intracube
troy_s: so you're suggesting the Beta camera should be set up to roll highlights up to converge to 'white'
02:53
troy_s
I know it sounds rather odd, but in terms of the display referred transform, it is actually the aesthetic that makes sense on a learned / emergent aesthetic basis; we learned that aesthetic from about a hundred years of cinema.
02:53
troy_s
intracube: No.
02:53
intracube
and do that 'in camera'?
02:54
intracube
so a post production procecss?
02:54
troy_s
intracube: And that's a critical point that the dumbfsckery misses when they get up in arms and ZOMFG about highlight "recovery" (junk term)
02:54
intracube
*process
02:54
troy_s
intracube: You want data from the camera, in typical post production pipes, to be 100% colour accurate. NOT deliver a 'good picture' or some other canned idiot response.
02:54
intracube
exactly what I thought :)
02:54
intracube
yep
02:54
troy_s
intracube: You want accuracy. Because in post, say you have three cameras, you want to match footage etc. for CGI or even simple grading.
02:55
troy_s
The last thing you want is a crapload of footage that each needs to be manually "bullshitted" to match.
02:55
intracube
absolutely
02:55
troy_s
(you want proper colour transforms the whole way in my opinion, not some junk grader in a room massaging curves.)
02:55
troy_s
You can ask anyone that worked on the Hobbit.
02:55
troy_s
They used el-crappo R3Ds and of course R3D doesn't give colorimetry
02:55
troy_s
So they had to accurately arrive at it (via math)
02:56
troy_s
But there is no shortage of dumbfsckery out there that will try to voice their opinion on how colour should work. I'd encourage ALWAYS thinking about the context. Apertus is first and foremost a motion picture camera for cinema.
02:56
troy_s
That means these sorts of issues are paramount.
02:57
troy_s
I don't care if someone thinks that some made-up crappy demosaicking technique 'looks better' or whatever
02:57
troy_s
Same goes for some default massaged imaging LUT.
02:57
troy_s
intracube: All of that is ignorance as to how the hell you end up with a motion picture (avant-garde or mainstream narrative) and the techniques to get there.
02:57
intracube
thinks he remembers seeing a graph showing the RGB response of the CMOSIS chip
02:58
intracube
mostly linear until the highlights where one of the channels drooped quite a bit
02:58
troy_s
intracube: The problem with graphs and charts is that they don't really give you the idea of response in a random context with regard to the standard observer.
02:58
intracube
or there was an odd kink, for want of a better word
02:58
troy_s
intracube: It's crazy. Don't forget, it is _not a spectral sensor_. That is, light sources, refractions, reflections will change that response.
02:59
troy_s
intracube: Yes. Nonlinear. That means matrix transforms fall apart. Rat piss yellow or El Purple Mundo.
02:59
troy_s
(Mostly Rat Piss Yellow off of foreheads.)
02:59
troy_s
(Which of course deviates off of the tungsten / daylight axis enough to make everyone spot it immediately.)
03:00
intracube
so many graders seem to love the look tho ;P
03:00
troy_s
It's tricky though, as many cameras need to figure out how to deal with that noise. Canons I believe add an offset.
03:00
troy_s
After a transform, you'll end up with negatives in that noise floor, and clamping actually increases the stop of red.
03:00
troy_s
(Because you are removing the negatives which are expressing the full rest of that colour.)
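The negatives-after-transform point is easy to demonstrate with a toy matrix (the coefficients below are illustrative, not any real camera's calibration): near the noise floor, the off-diagonal negative terms can push channels below zero, and clamping those to zero biases the surviving channel upward.

```python
import numpy as np

# Hypothetical camera-RGB -> working-RGB matrix (illustrative values only)
M = np.array([[ 1.6, -0.4, -0.2],
              [-0.3,  1.5, -0.2],
              [-0.1, -0.5,  1.6]])

sample = np.array([0.010, 0.002, 0.001])  # noisy value near the floor
out = M @ sample                          # yields negative G and B here
clamped = np.clip(out, 0.0, None)         # clamp discards that information
```

After clamping, only the red component survives, so the pixel reads as more red than the transform actually produced.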
03:03
intracube
hmm, yep
03:53
Bertl
off to bed now ... have a good one everyone!
03:53
Bertl
changed nick to: Bertl_zZ
09:55
pozitron
left the channel
09:56
arpu
joined the channel
12:43
Ladislas
left the channel
12:43
mgielda|away
left the channel
12:43
Ladislas
joined the channel
12:45
mgielda|away
joined the channel
12:52
danieel
left the channel
13:02
mithro
left the channel
13:04
mithro
joined the channel
13:11
Bertl_zZ
changed nick to: Bertl
13:11
Bertl
morning folks!
15:07
troy_s
intracube: https://www.reddit.com/r/photography/comments/3q8q7o/purple_noise_in_shadows/cwds00b
15:25
Bertl
off for now ... bbl
15:25
Bertl
changed nick to: Bertl_oO
16:11
alexML
troy_s: interesting, but I'm afraid rnclark's explanation about Canon offset is wrong (purple noise in Canons is not because of this offset, it's because the read noise is higher)
16:11
alexML
see the linked thread, at https://www.reddit.com/r/photography/comments/3q4tnz/how_to_correctly_push_5_stops_with_canon/cwcbyb5
16:12
alexML
there, he says the 2048 offset is extreme (9 stops) and Canon needs to lower it
16:12
troy_s
I am also reasonably sure that both explanations aren't quite as accurate.
16:12
alexML
I say the 2048 offset is quite reasonable and the only loss is some 0.2 stops
16:13
troy_s
The bottom line is that even if the calc were in float, the results would still derive negatives due to the transform.
16:13
alexML
ya
16:14
troy_s
Offset isn't an issue.
16:14
troy_s
Because given linearized sources, you can always offset, and offset back.
16:14
alexML
yes, offset actually helps (if you do noise reduction or frame stacking, you really need that negative data)
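The reversibility of an offset on linear data can be shown directly: adding a black-level offset before storage and subtracting it afterwards is a lossless round trip, and it preserves the below-zero noise that a hard clamp destroys (the 2048 value is the Canon-style offset from the discussion; the noise samples are made up):

```python
import numpy as np

OFFSET = 2048  # black-level offset, as discussed for Canon

noise = np.array([-30, -5, 0, 12, 400])  # signed sensor noise around black
stored = noise + OFFSET                   # everything stays non-negative
recovered = stored - OFFSET               # offset back: lossless

clamped = np.clip(noise, 0, None)         # hard clamp loses the negatives
```

Note the clamped version has a higher mean than the true signal, i.e. clamping biases the black level upward.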
16:15
alexML
btw, I've noticed that, besides white balance, applying the DNG color matrix increases the shadow noise a lot
16:17
troy_s
Matrices suck
16:17
troy_s
Full stop.
16:17
troy_s
They work wonders on idealized colour spaces and transforms, but for real world nonlinear things, they simply aren't effective.
16:18
troy_s
(Ignoring DNG which is neither here nor there nor relevant.)
16:18
alexML
agree, and I've noticed they fail miserably on highly saturated colors
16:18
troy_s
Of course. Perfectly sensible.
16:19
alexML
and that happens with perfectly linear sensor data
16:19
troy_s
Hence why for Apertus I would like to see a feathered 3D LUT so the colorimetry holds the whole range.
16:19
troy_s
Sensor data isn't perfectly linear.
16:19
troy_s
That is the issue
16:19
troy_s
None of them are.
16:19
alexML
yes, but can be linearized
16:20
troy_s
Not really
16:20
alexML
why?
16:21
troy_s
That is a hardware thing. I think Bert could come up with a reasonable test to try and linearize the nonlinear regions, but I significantly doubt the accuracy of it.
16:21
troy_s
Likely more prudent to simply profile the sensor as is, at various brackets, and use a 3D LUT to generate the accurate results.
16:22
troy_s
(As in a merged 3D LUT that properly models the nonlinear regions.)
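What "profile the sensor and use a 3D LUT" implies, mechanically, is a lattice of measured corrections sampled by trilinear interpolation. A minimal sketch (a real pipeline would build the lattice from measured chart/bracket data rather than the identity used here):

```python
import numpy as np

def apply_lut3d(lut, rgb):
    """Trilinear lookup into an (N, N, N, 3) LUT indexed by RGB in [0, 1]."""
    n = lut.shape[0]
    p = np.clip(np.asarray(rgb, float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(p).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = p - i0
    out = np.zeros(3)
    # Accumulate the 8 surrounding lattice points, weighted trilinearly
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

# Identity LUT: each lattice value equals its own coordinate
N = 17
g = np.linspace(0.0, 1.0, N)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
```

Unlike a 3x3 matrix, each lattice cell can encode its own local correction, which is why a profiled LUT can hold colorimetry through the nonlinear regions.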
16:23
troy_s
We can linearize colour space transfer curves to display linear no problem, but sensors might exhibit erratic bits.
16:23
hozer
joined the channel
16:24
alexML
what exactly do you mean by "erratic bits" ?
16:28
troy_s
I am just skeptical of a technique that would attempt to attain a purely radiometric linearization of sensor data.
16:28
troy_s
I am not saying it is impossible, at all. Just ... Skeptical.
16:29
hozer
left the channel
16:30
troy_s
Ultimately the goal is to dump to an EXR / DPX with scene referred data accurately represented calorimetrically.
16:30
troy_s
Colorimetrically.
16:30
troy_s
Thank you spell check.
16:32
hozer
joined the channel
16:35
troy_s
alexML: Bottom line is some IT8 tests. I am also not entirely sure how the CA transforms will impact the data. Might require an offset - transform - offset approach.
16:38
troy_s
(Or better, a light box and a spectro)
16:52
irieger
I will write something about the camera characterisation toolset we built at my university and how it works. Would love to have a light box with dialable mono wavelengths though ...
17:09
troy_s
irieger: I think a Jeti with a good light box would go a long, long way.
17:12
troy_s
Interestingly the Alexa to ACES matrices are per chromatic adaptation
17:12
troy_s
https://github.com/ampas/aces-dev/blob/master/transforms/ctl/idt/vendorSupplied/arri/alexa/AlexaParameters-2013-Nov-13/Alexa_aces_matrix.txt
17:13
troy_s
And per EI of course
17:13
troy_s
https://github.com/ampas/aces-dev/tree/master/transforms/ctl/idt/vendorSupplied/arri/alexa/v3/EI800
17:19
John_K
left the channel
17:20
alexML
troy_s: in Alexa_aces_matrix.txt, CCT is color temperature?
17:20
troy_s
Yes I believe so.
17:21
alexML
DNG is similar, but with only two matrices
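The DNG dual-matrix scheme alexML mentions blends its two calibration matrices linearly in inverse CCT (mired), per the DNG specification's described approach. A sketch with placeholder matrices and illuminant temperatures (not real calibration data):

```python
import numpy as np

def interp_color_matrix(m1, cct1, m2, cct2, cct):
    """Blend two calibration matrices by inverse correlated colour
    temperature, as the DNG spec describes for dual illuminants."""
    lo, hi = min(cct1, cct2), max(cct1, cct2)
    cct = min(max(cct, lo), hi)  # clamp outside the calibrated range
    # Linear in 1/CCT between the two calibration illuminants
    w = (1.0 / cct - 1.0 / cct2) / (1.0 / cct1 - 1.0 / cct2)
    return w * np.asarray(m1) + (1.0 - w) * np.asarray(m2)
```

The Alexa_aces_matrix.txt table linked above is the same idea with more sample points along the CCT axis.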
17:21
irieger
troy_s: Does the EI really have influence on the matrix? Thought it was only the linearization that is influenced by the EI
17:21
troy_s
alexML: DNG is a joke.
17:22
John_K
joined the channel
17:22
troy_s
alexML: But much like CMYK, the variables all sum up to unique transforms per set of contexts. So for the Alexa it is EI + [ND | no ND] + CA
17:22
troy_s
That add up to a unique context per.
17:22
troy_s
irieger: Of course it does.
17:22
alexML
irieger: the EI does not appear to have any influence on the matrix (it's just a gain)
17:23
troy_s
The gain ultimately shifts the matrix so it "does" I suppose is the best phrase.
17:24
troy_s
Although the maker script lists one gain.
17:24
troy_s
So it might be deeper.
17:25
troy_s
Wow. That script was written by Alex Forsythe
17:25
troy_s
Bonkers.
17:25
alexML
to me, the code linked above for Alexa looks no different than DNG, essentially (just a little better fine-tuned)
17:26
troy_s
alexML: Ask yourself how many vendors actually use DNG.
17:26
troy_s
Answer: Zero.
17:26
troy_s
For good reason.
17:27
troy_s
It isn't terribly well suited for cinema.
17:27
troy_s
Adding overhead that is really not all that useful.
17:27
troy_s
Easier to simply go from low level representation direct to DPX or EXR.
17:27
troy_s
Which will let you commence work.
17:28
troy_s
I appreciate the idea behind it, but it strikes me as a format that someone thought might have been suitable for work without really looking at the contexts.
17:28
troy_s
DPX already has enough metadata for colorimetry if you desperately want it, or even EXR.
17:30
troy_s
I just don't get the obsession with DNG. At all. Someone is welcome to offer another vantage.
17:30
irieger
troy_s: Quite a few people seem to use it with some Sony cameras and the Convergent Design Odyssey 7Q recorder. And Blackmagic
17:31
troy_s
irieger: Alexa: No. Sony: No. Even nasty ass R3D: No.
17:32
troy_s
irieger: The essence of my point still stands however; it is an intermediate and as intermediates stand, there exists another with longer tails.
17:33
troy_s
irieger: I welcome anyone that has ever done post work to try and express why DNG would be useful beyond say, DPX.
17:33
troy_s
It is just a silly discussion in my eyes.
17:33
troy_s
Camera records to some native format. Go to non-native for post such as EXR and start work.
17:34
irieger
Sure. I know, the big three have their own formats. But from what I have experience with, DNG works as well or as badly as the other vendor formats in Resolve, for example. Only thing is that only a small number of color spaces is supported by Blackmagic as the target after debayering
17:35
troy_s
irieger: Resolve is for grading.
17:35
troy_s
irieger: Bear in mind.
17:35
troy_s
And now, with BM, think about if it is an agnostic world view
17:35
irieger
troy_s: And if it works good, why should I go to exr first? Compressed DNG is much smaller than EXR (with PIZ or so) due to having only a bayer image.
17:36
irieger
Everything outside EXR, sure.
17:36
troy_s
So two facets: A company with a vested interest DNG writes software _and_ the fact that Resolve is for grading, which is the absolute last possible thing you do in a post production pipe.
17:36
troy_s
EXR is the defacto sole format intra pipeline.
17:36
troy_s
It is scene linear and your data would go there once, stay there forever until the very tail
17:37
troy_s
EXRs also can be compressed as can DPXs.
17:37
troy_s
(Not that storage is ever much of an issue)
17:37
troy_s
irieger: Fair?
17:37
irieger
troy_s: Sure. But in most workflows they may go through a grading system first, to get everything through one pipeline and one standardized process to EXR before later processing ;-)
17:37
troy_s
irieger: Never.
17:38
troy_s
irieger: See that ACES repo? Tell me where it goes through a grading system?
17:38
troy_s
;)
17:39
troy_s
Most folks don't get that grading is the last possible concern. It really is. It is a tremendous one, but in terms of pipelines, it is well after editing and the rest of post.
17:40
irieger
I listened to some post pro folks talking about how they went through the grading system where they did the EXR conversion and also delivered a look preview transform of some kind for the shot. And have you looked at the Baselight concept? It's killer
17:41
irieger
and the ACES repo is just the toolset to use in the system you like ;-)
17:41
troy_s
irieger: It gives you a pretty good idea where some world views meet reality.
17:41
troy_s
Baselight and Resolve stations are incredible. But again, grading.
17:42
troy_s
(As is a decent Lustre station)
17:42
troy_s
Typical pipe is something along the lines of:
17:43
troy_s
Preproduction, Shooting Production, Editing (spit out text files), Post production (EXR), Finishing (grading from whatever entry point)
17:44
troy_s
The ACES pipe would help you get the raw archival frames into a uniform colour space, with ACEScg and ACEScc (linear and nonlinear compatible) references for CGI and Color Correcting respectively.
17:45
troy_s
(ACES is a larger volume than ACEScg / cc which share identical primaries IIRC, but are designed for tristimulus rendering due to nuances of our colour models and the way the math works.)
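For reference, the ACEScc encoding being contrasted with linear storage here is a pure log curve over the ACEScg primaries. A scalar sketch following the published formula (S-2014-003; check the spec before relying on it):

```python
import math

def lin_to_acescc(x):
    """ACES linear -> ACEScc log encoding, per S-2014-003."""
    if x <= 0.0:
        return (math.log2(2.0 ** -16) + 9.72) / 17.52
    if x < 2.0 ** -15:
        # Soft segment blending toward the floor value near zero
        return (math.log2(2.0 ** -16 + x * 0.5) + 9.72) / 17.52
    return (math.log2(x) + 9.72) / 17.52
```

Mid grey (0.18) lands around 0.41 and linear 1.0 around 0.55, which is what gives graders a perceptually even grip on the tonal range.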
17:45
irieger
I wouldn't save anything in ACEScc ... Just stay linear for the files. But Grading often starts before or while shooting.
17:46
troy_s
No. It. Doesn't.
17:46
troy_s
One lights for sure
17:46
troy_s
As in "we need something for editorial / dailies"
17:46
troy_s
A one light / quickie grade can even be used as an entry point for a final grade, but that isn't touching the frame data.
17:46
troy_s
At all.
17:46
troy_s
It's a look.
17:47
troy_s
Your reference would remain raw.
17:47
irieger
Did I say anything like touching the data?
17:47
irieger
nope
17:47
troy_s
I was trying to say that the idea of a one light has nothing to do with post work
17:48
troy_s
And specifically, in terms of final looks, grading does _not_ really start before shooting.
17:48
troy_s
It is just temporary work.
17:48
irieger
Didn't say anything about the final grading here
17:51
troy_s
Anyways...
17:51
troy_s
Grading = moot point here.
17:52
troy_s
ACEScc is a damn smart grading space. It is basically a well researched set of primaries.
17:52
irieger
the most important point in a movie ;-)
17:52
troy_s
I love grading.
17:52
troy_s
I do. I love imaging.
17:52
irieger
troy_s: I love ACEScc, did some stuff in it
17:52
troy_s
irieger: The whole tristimulus oddness due to models is quite fascinating (and complex I suppose)
17:53
troy_s
irieger: Many folks don't get why manipulations done against primaries are important to consider the base primaries.
17:53
troy_s
ACEScg is that result.
17:54
troy_s
(Which offers solid results. DCI-P3 is another space that apparently offers decent results.)
17:55
irieger
Many folks do it totally wrong. And some vendors encourage them to do so ...
17:55
irieger
Resolve 12 color spaces are pretty fucked up and pissed me off when I first saw what they did
17:55
troy_s
Amen sister.
17:55
troy_s
Resolve was / is designed for professionals at one point.
17:56
troy_s
So you can bet there is plenty of stuff that still lets someone mangle up things beyond recognition.
17:56
irieger
It is for the masses now I'd say. They just keep it too simple at too many points
17:56
troy_s
Indeed. Prosumer. Bigger bucks.
17:56
troy_s
(Prosumer is really just consumer.)
17:57
troy_s
But their stations are still professional.
17:57
irieger
They have some hidden stuff for the big studios, I'm sure they get settings help for the "Advanced" tab in the config, which is nowhere documented but is their internal settings testbed also
17:57
troy_s
Most of the time they have to leave the industrial bits in otherwise the industrial users simply leave. (Hence AVIDs surge in the years following FCPX)
17:57
troy_s
Yep.
17:58
troy_s
Still, I will say it is probably illuminating as hell for someone coming from a kitchen sink app like FCP or some other NLE and going "Whoa"
17:58
troy_s
Hints at a larger process. Larger pipeline. Bigger things.
17:59
irieger
Had an interesting talk with Peter Chamberlain from the Resolve team at IBC. He admitted that they tried to make color spaces simple as fuck (he put it differently) so people don't have to think...
17:59
troy_s
You can't.
17:59
irieger
that is the problem.
17:59
troy_s
God knows I have watched this unfold over the (sad now decade+) time I have learned about them.
18:00
troy_s
The issue is probably misinformation. Personally, for the _longest_ time it was the single point of failure in my learning.
18:00
troy_s
Just trying to figure out why a wide gamut display is different from a small gamut was a huge leap of cognition.
18:01
irieger
When you select Sony S-Log2 as the source and Arri Log-C as the working color space, they convert primaries and gamma/log. When you go Log-C to Rec 709 or P3, they just convert color and clip pretty badly. Or Linear to Log-C means you expect linear Alexa Log-C primaries. Legit somehow, but not documented etc
18:01
troy_s
(I still suggest that understanding RGB as a _relative_ system is the entry point to understanding all of colour.)
18:01
troy_s
Whoa... Reading. Hold.
18:01
troy_s
Well "linear" is a nasty piece of work.
18:02
troy_s
Folks use it as a colour space, but it isn't. It is an attribute, a point that is missed by many.
18:02
troy_s
Language is huge here.
18:02
troy_s
Two types of linear too
18:02
troy_s
Scene linear and display linear, another bit of confusion.
18:03
irieger
troy_s: what I'm saying. It is fucked up. That's why I was so disappointed when I first checked out Resolve 12.
18:03
troy_s
Where scene linear will hold radiometrically linear values, and display linear is "only radiometrically linear up to a certain point then it doesn't really mean much"
18:03
troy_s
How can it not be????
18:03
troy_s
I would love to hear your idea.
18:03
troy_s
Because I still don't see a way. Only way is education. At least while we have RGB.
18:04
irieger
I think baselight is pretty straight forward. You have to learn about color management anyways and there I'd say they got a pretty usable concept
18:04
irieger
Don't say there is one without education though
18:05
troy_s
Even a basic scene linear to whatever transform is a nightmare.
18:05
irieger
Just fucking it up even more by trying to simplify it? Even worse in my point of view
18:05
troy_s
You heard Huxley's quote on that right?
18:05
irieger
?
18:05
troy_s
Your quote exactly
18:05
troy_s
Hold.
18:06
troy_s
“The Will to Order can make tyrants out of those who merely aspire to clear up a mess.”
18:06
troy_s
Cracks me up every time I read that.
18:06
irieger
As we do research about HDR at our school I asked Peter at IBC/Blackmagic Booth: What color space is used when setting HDR PQ 1000nits as the output Color Space? P3 or Rec 2020?
18:06
troy_s
But agree 198%
18:07
troy_s
I bet he stared.
18:07
troy_s
The Academy peeps I trust.
18:07
irieger
Guess it?
18:07
troy_s
They are ridiculously over the top
18:07
troy_s
All of the Academy folks have massive experience and deep math.
18:07
irieger
Nope. They just do the PQ curve and stay in the color space of the timeline.
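The PQ curve in question is SMPTE ST 2084, and this exchange highlights exactly why applying it alone is ambiguous: PQ only defines the transfer function; the container primaries (P3 vs Rec 2020) are a separate choice. A sketch of the PQ inverse EOTF from the published constants:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in cd/m^2 -> PQ signal in [0, 1].
    Says nothing about primaries: that choice is made elsewhere."""
    y = np.clip(np.asarray(nits, float) / 10000.0, 0.0, 1.0)
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2
```

A 1000-nit peak lands at roughly three quarters of the signal range, which is why "PQ 1000 nits" grades still need an explicit primary set to be unambiguous.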
18:07
troy_s
With real-world experience.
18:08
irieger
btw. I like the quote.
18:08
troy_s
You are surprised?
18:08
troy_s
The ACES stuff is so well done.
18:08
irieger
nope. But hey, why don't they even document it?
18:08
irieger
btw. we are at the front there https://hdr-2014.hdm-stuttgart.de
18:08
troy_s
Did you read Sayre's post on it regarding non-orthogonal basis vectors (discussion of why primaries matter during manipulation)
18:09
irieger
Our footage was used by the Pixar/ILM guys who did tests for the two first HDR releases with our material ;-)
18:09
troy_s
(Rick Sayre is Pixar's CTO for those not in the know. Ridiculous knowledge and comprehension.)
18:09
irieger
troy_s: not sure, can you give me a link?
18:10
troy_s
Very cool.
18:10
troy_s
It is why this channel is great really.
18:11
troy_s
irieger: I love the whole motion picture pipe because it forces various disciplines to mix, and the result is a much better understanding of models and why.
18:11
troy_s
Elevates work.
18:11
irieger
We sat there as 4 people who worked on this HDR stuff in a presentation at IBC and when they showed a photo from their first tests we recognised the footage directly ;-)
18:11
irieger
That's why I like it too
18:12
troy_s
irieger: You will be able to understand the whole thread https://groups.google.com/forum/m/#!msg/academyaces/9b4VuqPcOHQ/Do6yMSE3uZsJ
18:13
troy_s
Several brilliant peeps in that thread.
18:13
troy_s
Thomas is _deadly_ clever too.
18:13
troy_s
I spoke with him about helping out on the transforms for the Apertus. I think we may be able to work something out.
18:14
troy_s
irieger: As a baseline, whorfin is Sayre.
18:14
troy_s
irieger: Aka "Not wrong"
18:15
irieger
Ok, good to know
18:16
irieger
will look through it in the coming days. I'm just too tired today. Tomorrow I have to work on some color science ;-)
18:17
irieger
Or some more sensor tuning at first maybe
18:18
irieger
So good night everyone, the letters here are getting hard to read ...
18:18
troy_s
What are you on tomorrow?
18:18
irieger
Some Beta camera ;-)
18:18
troy_s
Night irieger. Looking forward to your return.
18:19
irieger
let's see if I return with some linear ACES examples soon
18:20
troy_s
irieger: Keep in mind...
18:21
troy_s
irieger: Those ACEScg/cc are different primaries.
18:21
troy_s
(I believe they are axis aligned.)
18:21
troy_s
Looking forward to see what comes out. Shoot some IT8s
18:21
irieger
I know there are differences
18:21
troy_s
(Ideally a few in the same shot with different exposures if you can.)
18:22
troy_s
irieger: Do you use the CTL at your school?
18:22
irieger
Will look if we have one or so at our school.
18:22
troy_s
(If not, perhaps you can begin integrating it.)
18:23
irieger
I'm one of the few who has really dived in that deep yet. But I suggested a current production should go ACES
18:23
irieger
(shot on 6 or 7 cameras, VFX project)
18:24
troy_s
https://github.com/ampas/CTL
18:24
irieger
i know
18:24
troy_s
Oh very interesting. Can you keep me in that loop? Would love to watch it.
18:24
troy_s
(The progress)
18:24
irieger
don't know if/when it will be published online. You know, festivals etc.
18:25
irieger
https://github.com/irieger/aces-dev / https://github.com/irieger/CTL
18:25
troy_s
What state is it in? Preproduction?
18:25
irieger
did some small hacks already. but as I said, hacks
18:25
troy_s
LOL
18:25
irieger
postpro
18:25
troy_s
Ironic.
18:25
troy_s
Starting or knee deep?
18:25
irieger
they started comping. don't know exactly
18:26
irieger
just helped a bit with some small stuff. (Said they should go full ACES months ago btw.)
18:27
troy_s
irieger: You know about OCIO's ociolutimage?
18:27
troy_s
(It generates RGB all colour patterns etc.)
18:27
irieger
I know a nuke node for it at least ;-)
18:27
irieger
my ctlrender can do it already so why bother? ;-)
18:31
irieger
so, off now really
18:31
irieger
changed nick to: irieger|away
18:36
danieel
joined the channel
18:54
Mikea
joined the channel
19:02
Mikea
left the channel
19:35
pozitrono
joined the channel
20:26
se6astian|away
changed nick to: se6astian
21:11
se6astian
off to bed
21:11
se6astian
changed nick to: se6astian|away