
#apertus IRC Channel Logs

2014/10/31

Timezone: UTC


23:01
Rebelj12a
Sorry for the lab bot spam. Getting the ideas out simply helps free up my mind.
23:01
Rebelj12a
A lot of them don't have examples yet; I'm working on it.
23:13
Rebelj12a
Clarification on the port
23:14
Rebelj12a
Sorry, clarification for the port included in the Lab
23:49
g3gg0
left the channel
00:06
Rebelj12a
left the channel
00:06
Rebelj12a
joined the channel
00:20
comradekingu
Rebelj12a: you can change the title afterwards. "OSArduino FF" I didn't understand
00:22
Rebelj12a_
joined the channel
00:24
Rebelj12a
left the channel
00:24
Rebelj12a_
changed nick to: Rebelj12a
00:26
comradekingu
aombk: Maybe the cheapest BMCC has a small sensor, people can't afford extreme-aperture lenses or lighting equipment, or they just leave the white balance at default
00:27
comradekingu
Or it could be some sort of colour range issue, or an encoding/transcoding issue, since most films I see from it are presented online in one form or another
00:28
comradekingu
If it just looked natural, people wouldn't notice; that's a sociological trait, you pay more attention to the cases where it looks odd, amplified by the urge of filmmakers to try and make it look like only a movie could
00:32
comradekingu
But yes, I think the BMCC is good at making amateur film look amateur, which is a good thing in many ways
00:43
Bertl
off to bed now ... have a good one everyone!
00:43
Bertl
changed nick to: Bertl_zZ
00:45
wescotte
left the channel
01:33
comradekingu
left the channel
05:08
slikdigit_
left the channel
05:49
comradekingu
joined the channel
06:23
Bertl_zZ
changed nick to: Bertl
06:23
Bertl
morning folks!
07:02
philippej
joined the channel
07:13
Bertl
off for now ... bbl
07:13
Bertl
changed nick to: Bertl_oO
08:05
philippej
left the channel
08:50
sb0_
joined the channel
09:21
sb0_
left the channel
10:08
danieel
joined the channel
10:26
PhilCee
joined the channel
12:25
surami
joined the channel
12:26
surami
left the channel
12:29
aombk
thats nice
12:29
aombk
open sauce
13:09
lab-bot
sebastian closed T40: Mexico City TAG Festival Flyer as "Resolved". http://lab.apertus.org/T40
14:16
comradekingu
left the channel
14:49
troy_s
Morns.
15:13
Bertl_oO
changed nick to: Bertl
15:13
Bertl
back now ...
15:20
MikeA
joined the channel
15:24
Bertl
welcome MikeA!
15:29
MikeA
left the channel
15:36
Bertl
cya MikeA!
16:26
intracube
hello
16:26
Bertl
hey
16:26
intracube
hi Bertl :)
16:27
intracube
have there been any further tests on the CMV12000 HDR mode?
16:27
Bertl
AFAIK, not yet
16:27
intracube
ah. I was wondering how far it can be pushed.
16:28
intracube
like fairly linear for shadows and mid-tones, but an extreme slope to capture highlights
16:28
intracube
like lamp filaments, etc
16:28
Bertl
it's always linear (piecewise linear :)
16:29
Bertl
but I get what you mean
16:29
intracube
well, the uppermost piecewise linear slope! :P
16:29
intracube
would allow for some great options in post
16:29
intracube
like do decent physically correct lens flares
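A minimal sketch of the kind of multi-slope (piecewise linear) response Bertl and intracube are describing, where shadows and mid-tones stay linear and successive knee points compress the highlights; the knee positions and slopes below are invented example numbers, not the CMV12000's actual settings:

```python
def piecewise_linear_response(x, segments=((0.0, 1.0), (0.6, 0.25), (1.5, 0.05)), clip=1.0):
    """Map relative scene exposure x (0..inf) to a normalized output.

    segments: (input_start, slope) pairs sorted by input_start; the first
    segment is plain linear, later ones use shallower slopes so very bright
    input keeps rising slowly instead of clipping immediately.
    """
    y = 0.0
    for i, (start, slope) in enumerate(segments):
        if x <= start:
            break
        end = segments[i + 1][0] if i + 1 < len(segments) else float("inf")
        y += slope * (min(x, end) - start)
    return min(y, clip)

# exposure 0.5 is still on the linear segment; 1.0 and 4.0 land on the compressed slopes
print(piecewise_linear_response(0.5), piecewise_linear_response(1.0), piecewise_linear_response(4.0))
```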
16:49
troy_s
intracube: Log curve. :)
16:49
intracube
waits for Bertl to disagree :P
16:50
troy_s
intracube: Pretty sure Bertl will agree that the most ideal PLS mode is a log-approximation.
16:50
troy_s
intracube: It makes exposure a little more fussy, but it is also precisely why the existing high-end cameras record to log.
16:51
troy_s
intracube: Not sure about your comment on lens flares though.
16:52
intracube
troy_s: what about flares?
16:53
troy_s
intracube: Your comment about physically correct flares.
16:53
intracube
oh, meaning in post you'll have access to a wider dynamic range compared to if it was recorded fully linear
16:54
intracube
so any soft-focus/flare effect will be more proportional to the light intensities in the original scene
16:54
troy_s
"fully linear" makes very little sense with current technology levels in cameras.
16:54
troy_s
(Some day it will make great sense. Just not today.)
16:55
troy_s
And you still have a large contingent of bumblefsckery that is trapped in display referred imaging apps thinking it is the only way.
16:55
intracube
if the highlights are clipped, there's no way to distinguish between something bright vs uber-bright in post
16:55
troy_s
There's no way to do that with PLS either.
16:55
troy_s
A typical light will blow out still because the ratios are vast.
16:55
intracube
which I think is why flares done in post on 8bit camcorder footage usually look poor
16:56
troy_s
You almost always have to dim down an exposed bulb on a set.
16:56
intracube
troy_s: right, but there will be -more- range than otherwise
16:56
troy_s
Well it is, but that is because most folks haven't a clue what they are doing.
16:56
troy_s
You can do a luminance key to yank values and log them up to relative ranges.
16:57
troy_s
The main reason that flares look like crap by and large (and any other photometric thing like fire) is that people _really_ have no clue about the differences between display referred and scene referred imaging.
16:57
troy_s
Really _no clue_. Nor why it is important.
16:57
troy_s
Have a look at a typical fire comp and you get an idea pretty damn quick.
16:58
troy_s
The way a flare interacts with the existing image is very similar to the way that fire adds.
16:58
intracube
but my original point still stands, right?
16:58
troy_s
In particular, the values have got to be composed in a scene-linear format, and you can't use a blind clip on the view transform.
16:59
troy_s
Your point certainly leans toward an awareness of higher dynamic range and why it is important, but it only solves a small piece of the puzzle.
16:59
troy_s
As in, I could assure you that a 20 stop PLS curve will still result in crap work if artists are happily churning away on AE.
16:59
troy_s
Because it _really_ is a massive mental leap apparently to flip between the two models.
17:00
troy_s
(and even when you do flip to it for imaging, controlling that transform back to display referred land is rather important for that roll-off you were speaking of.)
17:00
troy_s
(that's not quite as dead-straight forward)
17:00
intracube
wouldn't you just have an inverse transform to turn the PLS curve back to as close to linear as possible?
17:00
troy_s
That gets you into a working space for certain.
17:01
intracube
then in blender's comp (for example) apply whatever flare effect you want, then finally grade to taste
17:01
troy_s
That gives you linearized values in a scene-ish sort of format. As in zero to infinity.
17:01
intracube
right
17:01
troy_s
But the problem is that you are in a working space that is scene referred and that means that _black and white are creative choices_
17:01
troy_s
So you have an issue here... in particular two issues:
17:01
troy_s
A) What value the artist chooses to set for black and what value for white. This is sort of the clip zone, with rolloffs above and below respectively.
17:02
troy_s
B) How you are going to deal with the roll off to display referred white.
17:02
intracube
of course. but it should give better results than shooting in 'normal' profile on a consumer 8bit DSLR and trying to add a flare in sRGB gamma 2.2 space
17:02
troy_s
B is subtle and more nuanced.
17:02
troy_s
sRGB is junk agree.
17:02
troy_s
You can see why B is a complex issue?
17:02
troy_s
Take for example, scene linear values of something like say... a really ridiculously sun lit house that is green.
17:02
intracube
B is an artistic choice
17:02
troy_s
(Keeping this example stupidly faked)
17:03
intracube
(what happens close to clipping point in display referred land)
17:03
troy_s
Let's pretend that our scene data has a hot spot value that we swatch at say RGB values of 21 103 35
17:03
troy_s
As in
17:03
troy_s
21.8 103.712 35.1932
17:03
troy_s
Okie?
17:04
intracube
yes
17:04
troy_s
Now let's say intracube the grader decides that he wants his white clip to live at something around say.... 35.
17:04
troy_s
So our white point is now 35 for our transform back to display referred land. That is, 35.0 we scale to 1.0 display referred.
17:04
troy_s
The 103 is a unique problem...
17:05
troy_s
Because our transition to white is a hard clip.
17:05
troy_s
where we might want a roll off
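A tiny sketch of the scale-and-clip transform being discussed, using the swatch values from the example above; the point is that with a creative white point of 35.0 and a naive per-channel hard clip, the 103.7 green simply flattens against 1.0 instead of rolling off:

```python
scene_rgb = (21.8, 103.712, 35.1932)   # scene-linear swatch from the example
white_point = 35.0                     # creative choice: this value maps to display 1.0

display_rgb = [min(c / white_point, 1.0) for c in scene_rgb]
print(display_rgb)  # approx [0.623, 1.0, 1.0] - green (and blue) hard-clip while red stays put,
                    # so the hot spot never blooms toward display white
```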
17:05
intracube
you'd scale the whole range (some simple exposure setting) down to fit
17:05
troy_s
Not always.
17:05
intracube
then apply non-linear curve (artistic choice)
17:05
troy_s
You might want that 35 level to be blooming to white
17:05
troy_s
as in a typical film bloom
17:05
troy_s
which is actually a very unique intensity transform
17:05
g3gg0
joined the channel
17:06
troy_s
Something like "keep values at their value up to display referred 0.9
17:06
troy_s
but as we get above that 0.9
17:06
troy_s
we actually want _all_ the wells to saturate
17:06
troy_s
that is
17:07
troy_s
If we pretend we shot that green barn on film... and let's pretend it is a mythical narrow band green... the value that perfectly saturates the G register in digital.
17:07
troy_s
In film. if that barn were _ridiculously_ hot, the film would burn out to white.
17:07
troy_s
Which is _not_ what you get with a hard digital clip, curves or otherwise.
17:07
troy_s
So what you really are asking for is a 3D LUT, where the values remain 1:1 up to, say, 0.9
17:07
troy_s
but when green steps over 0.9 to 0.91 perhaps, then we want to also rapidly bring up blue and red.
17:08
troy_s
Otherwise, no matter _how_ hot something is, it will never blow out.
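A hedged sketch of that behaviour: once the hottest channel crosses a knee (0.9 here), the other channels get pulled up toward display white as well. In practice this would be baked into a 3D LUT rather than run per pixel like this, and the knee and blend are invented for illustration:

```python
def bloom_to_white(rgb, knee=0.9):
    m = max(rgb)
    if m <= knee:
        return rgb                                   # below the knee: values stay 1:1
    t = min((m - knee) / (1.0 - knee), 1.0)          # how far past the knee we are, 0..1
    return tuple(c + t * (1.0 - c) for c in rgb)     # drag every channel toward white

print(bloom_to_white((0.2, 0.95, 0.3)))   # hot green starts lifting R and B
print(bloom_to_white((0.2, 1.00, 0.3)))   # a fully saturated green burns out to white
```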
17:08
troy_s
You can demo this very easily in Blender by creating a fully saturated R G or B face or use a neutral white and use a fully saturated R G or B light
17:08
troy_s
and watch how the default blind sRGB transform works.
17:08
intracube
troy_s: of course. this is something that a grader/colourist would do instinctively
17:08
troy_s
The _only_ way you get a 'blow out to white' is if it is non-fully-saturated.
17:09
troy_s
Not really.
17:09
troy_s
It's a rather nuanced detail, and given that many folks likely don't see the issue, the chances they will try to emulate a more correct LUT is low.
17:10
troy_s
(most folks around Blender for example, think the default LUT is "blowing out" values above the display referred land. This is not the case. It just happens that very hot intensities happen to push even moderately saturated R G B to clip.)
17:10
intracube
isn't this just 'coping with highlight rolloff 101'??
17:10
troy_s
Not really.
17:10
troy_s
It's a very nuanced view of that issue.
17:10
troy_s
Highlight rolloff is one thing.
17:11
troy_s
But not seeing a clear and unique transform between scene referred and display referred is the real issue.
17:11
troy_s
Whereas with display referred, one can almost always assume (except for intermediate protocols like broadcast YCbCr with scale and offsets) that display referred 0 and 1 are black and white respectively.
17:11
intracube
idk, managing how the 3 channels approach white should be well understood
17:12
troy_s
Except many folks screw around in half assed display referred models.
17:12
troy_s
Which obfuscates the real transform.
17:12
troy_s
And you certainly can't just do uniform curves
17:12
troy_s
at all.
17:12
troy_s
because that would completely squonk your looks.
17:13
intracube
troy_s: isn't this what you're basically describing: https://lh4.googleusercontent.com/-jkqLEuzpcBM/VFPQzlXNUVI/AAAAAAAAA7I/ilEikxAX8vA/w1039-h696-no/sun_highlights.png
17:13
troy_s
Except more pronounced.
17:13
troy_s
As in the object is more saturated along a single channel.
17:14
troy_s
It makes the issue much more complex.
17:14
troy_s
(Which can be carpet bombed with a good 3D LUT)
17:14
intracube
oh ok
17:14
troy_s
You can clearly see the issue if you think of a whacked channel.
17:14
intracube
yep
17:14
troy_s
like the 20 100 30
17:14
troy_s
Not the most easy thing to deal with curves
17:14
troy_s
At all.
17:15
troy_s
You need sort of a converging film-ish 3D LUT (and it must be 3D because the single channel must slide the other channels)
17:15
intracube
yes, it needs something other than RGB curve adjustment for sure
17:15
troy_s
You _could_ probably manually mangle the crap out of curves per shot
17:15
troy_s
but I would think this is unmanageable after a few shots.
17:22
troy_s
intracube: I just emailed a buddy on a tool to generate those sorts of LUTs.
17:26
troy_s
Not sure an easy one exists really. I mean we could always load a Granger Rainbow, manipulate, then generate the 3D LUT off of it.
17:26
slikdigit_
joined the channel
17:43
slikdigit_
left the channel
17:51
slikdigit_
joined the channel
17:51
slikdigit
left the channel
17:51
slikdigit_
changed nick to: slikdigit
17:52
slikdigit_
joined the channel
17:52
slikdigit
left the channel
17:52
slikdigit
joined the channel
17:52
troy_s
intracube: Did you know that the 7D can record raw now offspeed via ML?
17:55
Bertl
offspeed means?
17:55
intracube
^ +1 offspeed?
17:56
intracube
and nope, but I don't have Canon DSLRs
17:56
intracube
when is ML going to hack Nikon FW?
17:57
intracube
but recent nikons can record clean/raw over HDMI
17:57
Bertl
sure about that?
17:58
Bertl
because raw usually involves more than 8 bit :)
17:59
intracube
Bertl: I mean uncompressed, not pre-debayered
18:00
Bertl
still probably useless with 8 bit depth, no?
18:00
intracube
no
18:02
intracube
the h.264 encoder in my D610 isn't too bad
18:02
intracube
but HDMI to an Atomos is obviously going to be much better
18:03
intracube
Bertl: it's not broadcast quality, true. but then most TV broadcasts aren't broadcast quality these days :P
18:03
intracube
the BBC don't transmit 'HD' at all (by their own internal definition)
18:04
troy_s
Bertl: 14 bit
18:05
troy_s
Bertl: On the 7D.
18:05
intracube
and offspeed means?
18:05
Bertl
troy_s: yeah, but unlikely via HDMI
18:09
troy_s
Offspeed = anything other than 24/25 FPS
18:09
troy_s
so 2.5x for 60
18:12
Bertl
ah, so probably slower then
18:12
intracube
troy_s: do you mean native 60fps?
18:14
troy_s
Yes
18:14
troy_s
With aspect ratio native 2.67:1
18:14
troy_s
Rather awesome really.
18:14
intracube
2.67:1? eww!
18:14
intracube
;)
18:14
troy_s
Figure why waste pixels really
18:15
troy_s
Unless you need overscan for some vfx.
18:15
troy_s
but it has some fine granularity actually
18:15
intracube
or you prefer tall-screen format
18:16
troy_s
alexML_: Ping.
18:16
intracube
thinks academy/silent aperture is hugely underrated
18:17
troy_s
How come?
18:17
troy_s
Academy aperture is lovely.
18:17
intracube
troy_s: because for the last 10-15 years widescreen has been pushed hugely
18:18
intracube
by tv manufacturers, particularly
18:19
troy_s
More 21:9 displays out there now than there ever has been.
18:19
intracube
ask most people on the street and they'll probably say that 16:9/2.39:1 are inherently 'wider'
18:20
intracube
'same height as 4:3 but more at the sides' - so wrong
18:21
intracube
personally, I like all the common formats. depending on film, genre, whatever
18:24
intracube
troy_s: 21:9 is still quite niche
18:28
troy_s
What do you mean they will say wider?
18:28
troy_s
21:9 is wider. So I'm a little confused as to your meaning (I know you know what you mean, but I just lost the message)
18:29
intracube
21:9 is wider than what?
18:29
troy_s
Than 16:9
18:29
intracube
a ratio is a ratio, no actual height or width involved
18:29
troy_s
Erm.
18:29
intracube
just conventions
18:29
intracube
you could argue 16:9 is wider than 12:9 (4:3)
18:29
troy_s
I'll still say that when we refer to 21:9, we can easily say that it will be wider than the 16:9
18:30
troy_s
because of the implied X
18:30
intracube
or that 16:12 (4:3) is taller than 16:9
18:30
troy_s
as in 21:9(x) is wider than 16:9(x)
18:30
troy_s
Fair point?
18:30
troy_s
10 seconds... 200 megs!
18:30
troy_s
Yikes. Going to need to buy a shitload more CF cards.
18:30
intracube
common height is an assumption
18:31
troy_s
Well even with common height, I still don't understand.
18:31
intracube
try common width
18:31
troy_s
21:9(h) is still wider than 16:9(h)
18:31
troy_s
Hrm.
18:31
troy_s
Except that even when discussing BluRay etc.
18:31
troy_s
It isn't the case.
18:31
intracube
16:9 vs 16:12
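A quick arithmetic illustration of the disagreement above: a ratio says nothing about size until you fix one dimension, so 21:9 is only "wider" than 16:9 at a common height, while at a common width 16:9 is simply shorter than 16:12 (4:3):

```python
height = 9
print(21 / 9 * height, 16 / 9 * height)     # widths at the same height: 21.0 vs 16.0
width = 16
print(width / (16 / 9), width / (16 / 12))  # heights at the same width: 9.0 vs 12.0
```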
18:32
troy_s
as there is a squish in there to gain quality
18:32
troy_s
(on 99% of modern titles)
18:32
intracube
troy_s: blu-ray is a poor format. 16:9 all the way. no proper provision for 2.39 or 1.33
18:32
intracube
other than hard letter/pillarboxing
18:32
intracube
= wasted pixels
18:33
intracube
even DVD was aspect switchable
18:33
troy_s
No wasted pixels.
18:33
troy_s
Enhanced == non-square pixels.
18:33
intracube
afaik blu-rays PAR is always 1:1
18:33
troy_s
I have several that are enhanced.
18:34
intracube
blu ray??
18:34
troy_s
Not sure if there are more than say, 1.33
18:34
troy_s
Yes
18:34
troy_s
Not entirely sure of the nuances. Was quite familiar with the 1.33 to 16:9 enhanced.
18:34
troy_s
But no real clue on Blu-Ray.
18:35
intracube
can you remember what they are?
18:35
troy_s
I think there is even a push for 21:9 enhanced
18:35
troy_s
But I might be wrong
18:35
troy_s
(Pretty sure it isn't there yet)
18:35
troy_s
Into the Void maybe?
18:36
bassam__
joined the channel
18:37
slikdigit
left the channel
18:37
bassam__
changed nick to: slikdigit
18:37
slikdigit
left the channel
18:37
slikdigit
joined the channel
18:37
intracube
troy_s: let me know if you find any more info about that
18:38
troy_s
intracube: Sure. I'm sure Google is wiser than I am, though.
18:38
intracube
I thought BR-Video was always 16:9 for HD, but had provision for either 4:3/16:9 SD
18:38
troy_s
Not entirely sure.
18:38
troy_s
And "enhanced" could very well be bunko crap
18:38
troy_s
(As compared to DVDs which had an explicit meaning at one point)
18:39
intracube
right. DVD 'enhanced widescreen' usually meant anamorphic rather than hard letterboxed
18:39
intracube
troy_s: the HDMI spec would answer this
18:40
troy_s
Where is Andrew Baldwin
18:40
troy_s
He should skulk in here dammit.
18:40
intracube
if there's provision for 2.39:1 signalling to the TV/display
18:40
troy_s
???
18:40
troy_s
Why?
18:40
troy_s
I'm sure there is, but why would it answer it?
18:40
intracube
if not, then no way for the BR player to tell the TV to adjust the image
18:42
troy_s
Oh. I am certain 21:9 isn't there because there is a movement to add it, but I'm not entirely sure that some don't have some form of squeeze.
18:42
intracube
can't see how that would work
18:42
troy_s
Only know I have an enhanced (or a couple) of BRs.
18:42
troy_s
Huh?
18:42
intracube
is likely marketing guff :P
18:42
troy_s
How a 21:9 squeeze would work?
18:42
troy_s
Pretty simple really.
18:44
troy_s
g3gg0: Do you have Andrew's email?
18:45
intracube
and hi bassam / slikdigit / Blenderer :)
18:48
troy_s
HOLY CRAP
18:48
troy_s
alexML_ / g3gg0 I am blown away. It has been so long since I futzed with ML but 60 fps raw on my older 7D is ...
18:48
troy_s
I'm blown away.
18:52
slikdigit
heya intracube :)
18:53
comradekingu
joined the channel
18:54
slikdigit
troy_s, I think I'm actually contemplating reading CIE publications....
18:54
troy_s
slikdigit: OMFG
18:54
troy_s
slikdigit: May I ask why?
18:54
slikdigit
well
18:54
troy_s
I'll say first and outright, CIE stuff is sort of level three.
18:54
slikdigit
most of the 'for artists' stuff that's written on the topic rehashes stuff I already know or found out "the hard way"
18:54
troy_s
I'd suggest solidifying your second level first, because much of the CIE stuff really needs a good carpet bombing of the basics.
18:55
troy_s
What in particular?
18:55
slikdigit
I'm interested in the math
18:55
troy_s
(I'm asking because frankly, if much of that is already known, I'd need to change my leaning toward middle on.)
18:55
troy_s
What part of the math?
18:55
slikdigit
I did find some stuff online about converting between XYZ and RGB primaries for instance
18:55
troy_s
To be brutally honest, if you can wrap your head around the following two sets of general knowledge, the rest is dead simple.
18:56
troy_s
1) The difference between scene referred data and display referred.
18:56
slikdigit
so there's a fair bit I'm going to read before then. I'm dreading the cie stuff because I think it'll be really dry
18:56
troy_s
2) That primary lights for RGB are different per space.
18:56
troy_s
I have a few tomes you might want to peruse.
18:56
slikdigit
oh cool
18:56
troy_s
It generally is quite dry
18:56
troy_s
but the math is pretty simple
18:57
troy_s
That is, _any_ tri color space can be changed from one to the other via a simple 3x3 matrix
18:57
troy_s
That part is clear I assume?
18:57
slikdigit
1) I don't get yet (terminology lapse), but 2) I do
18:57
troy_s
(Tri-color spaces include XYZ)
18:57
aombk
time is not moving as fast as I would like
18:57
aombk
towards the beta
18:57
slikdigit
XYZ is kinda cool
18:57
troy_s
So... for example... if we look at Lindbloom's D65 sRGB to XYZ transform
18:57
troy_s
It is:
18:57
aombk
Bertl, is there something I can do?
18:58
slikdigit
I wonder if renderers should start using it internally (or if some already do)
18:58
troy_s
0.4124564 0.3575761 0.1804375
18:58
troy_s
0.2126729 0.7151522 0.0721750
18:58
troy_s
0.0193339 0.1191920 0.9503041
18:58
troy_s
I'll explain that in a second.
18:58
troy_s
There's an interesting nuance you probably aren't aware of
18:58
troy_s
that will blow your mind
18:58
troy_s
(I assure you)
18:58
Bertl
aombk: there is a lot of stuff which needs to be done
18:58
troy_s
See that 3x3 matrix?
18:58
troy_s
That's the default D65 sRGB to XYZ transform. (Which also means it is totally wrong for _any_ ICC based workflow)
18:59
troy_s
That will take RGB and transform it to XYZ. But there's some very interesting stuff in there slikdigit.
18:59
troy_s
See the MIDDLE row?
18:59
troy_s
Does that look familiar at all to you?
18:59
Bertl
aombk: currently most work needs to be done on designing the PCBs
18:59
troy_s
slikdigit: http://en.wikipedia.org/wiki/Rec._709#Luma_coefficients
18:59
aombk
ok then I'd better not speed up time
19:00
troy_s
slikdigit: See the overlap?
19:00
aombk
oh slikdigit, hi!
19:00
Bertl
aombk: but a lot of other stuff needs to be prepared, see the lab
19:01
slikdigit
hia aombk
19:02
se6astian|away
changed nick to: se6astian
19:02
slikdigit
troy_s, cool stuff, and I hadn't seen that wiki page before, silly me
19:02
Bertl
aombk: and slikdigit needs some constant motivation to work on the Beta system image :) maybe help him with some music?
19:02
troy_s
slikdigit: Do you see what I am getting at yet?
19:02
se6astian
good evening
19:03
Bertl
evening se6astian!
19:03
troy_s
That middle row is the luminance coefficients for sRGB / 709 (sRGB and 709 have identical primary lights so the transform is identical as is the gamut)
19:03
troy_s
What you see in that matrix is actually the XYZ positions.
19:03
troy_s
In columnar format.
19:04
troy_s
So Red's XYZ is 0.4124564, 0.2126729, and 0.0193339
19:04
troy_s
Luminance (the middle row) sums to 1.0 because... CIE standard observer model (and the eyeball thanks to the pupil) is display/output referred.
19:04
troy_s
(display/device/output referred are all synonymous - values have 'special' meaning at 0 and 1.0, that is there is a minimum and a maximum)
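A small sketch of the matrix multiplication being walked through: multiplying linear D65 sRGB/Rec.709 values by Lindbloom's matrix gives CIE XYZ, each primary's XYZ is a column of the matrix, and the middle (Y) row is the Rec.709 luma coefficients, summing to roughly 1.0 for white:

```python
SRGB_TO_XYZ = (
    (0.4124564, 0.3575761, 0.1804375),
    (0.2126729, 0.7151522, 0.0721750),
    (0.0193339, 0.1191920, 0.9503041),
)

def srgb_linear_to_xyz(rgb):
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_XYZ)

# Pure red: its XYZ is the first column of the matrix, and Y (luminance)
# is the 0.2126729 coefficient from the Rec.709 luma formula.
print(srgb_linear_to_xyz((1.0, 0.0, 0.0)))
# Reference white (1, 1, 1): each row sums, so Y comes out to ~1.0.
print(srgb_linear_to_xyz((1.0, 1.0, 1.0)))
```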
19:04
aombk
Bertl, I'm trying to but he doesn't like it
19:04
slikdigit
oh neat, yeah, makes sense; but I don't know yet what display vs. scene referred means
19:05
troy_s
slikdigit: Think of the easiest litmus test: IF your model has a special meaning to 1.0 (such as white) THEN you are using a display / device / output referred model.
19:05
troy_s
slikdigit: IF your model has no special meaning to 1.0, THEN you are likely using a scene referred model.
19:05
Bertl
aombk, slikdigit: sorry, I mean skinkie_ (auto completion failure)
19:06
slikdigit
Bertl, ahah! I was a bit confused there
19:06
troy_s
slikdigit: This is _even_ easier to explain to someone familiar with 3D because you can emulate it very quickly. Drop four point lights in front of a cube and use a raytracer. Sample the values.
19:06
aombk
Bertl, and I am not yet able to browse the lab and understand what's what. I know I have to watch the whole lab presentation video
19:07
Bertl
learn it, motivate folks to use it properly (same goes for the wiki) and you're already helping out a lot
19:07
dmjnova
joined the channel
19:07
Bertl
wb dmjnova!
19:08
dmjnova
hi Bertl
19:08
troy_s
slikdigit: If you try that you will see values like 13 or even 100 or more.
19:09
troy_s
slikdigit: An internal raytracer is scene referred, and the values extend from zero to infinity representing the scene, and obviously all radiometrically correct (aka linear model)
19:09
troy_s
slikdigit: sense?
19:10
slikdigit
right, so I'm guessing that's scene referred values since you can't actually display it
19:10
troy_s
Yep. And that's the part so many folks in many places fail to get.
19:10
troy_s
The transform from scene referred to display referred is a creative choice.
19:10
troy_s
That 'how' and 'why' you choose to do something.
19:11
troy_s
Black and white are given meaning in display referred, and they don't exist in scene referred.
19:11
troy_s
There is just a minimum and maximum, which isn't always your black and white point.
19:11
troy_s
In Blender for example, if you flip open the color management panel in properties, you can turn on "Use Curves" and map that point light setting
19:11
troy_s
Such that the HDR ranges are rolled into range.
19:12
slikdigit
right, that pretty much matches what blender does
19:12
troy_s
The default in Blender (and Nuke etc.) is a rather blind and ignorant sRGB transform, which is _not_ a suitable transform.
19:12
troy_s
Most colorists add a step in there before worrying about sRGB
19:12
troy_s
scene -> a log variant -> output
19:12
troy_s
the log is sort of a base entry point for grading
19:13
troy_s
because it takes scene referred values (zero to infinity) and maps them into display referred values generally (0..1)
19:13
troy_s
and applies a rough curve
19:13
slikdigit
and you don't 'really' care what the curve is because you're going to tweak the look anyway
19:13
troy_s
if your scene extends rather high into values, the log will stuff those down lower, so some tweaking may be needed to find the middle grey exposure.
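A toy lin-to-log shaper in the spirit of the step described above: scene-linear values from zero to infinity get squeezed into roughly 0..1 so a grade has something sensible to start from. The middle grey and stop ranges are invented for the sketch, not any vendor's actual log curve:

```python
import math

MID_GREY = 0.18      # scene-linear middle grey (assumed)
STOPS_BELOW = 6.0    # stops of shadow range mapped below 0.5
STOPS_ABOVE = 6.0    # stops of highlight range mapped above 0.5

def lin_to_log(x, floor=1e-6):
    stops = math.log2(max(x, floor) / MID_GREY)
    t = (stops + STOPS_BELOW) / (STOPS_BELOW + STOPS_ABOVE)
    return min(max(t, 0.0), 1.0)

print(lin_to_log(0.18))  # middle grey lands at 0.5
print(lin_to_log(8.0))   # a hot scene value (~5.5 stops over grey) is pulled down to ~0.96
```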
19:13
troy_s
Exactly.
19:13
troy_s
And _all_ of that creative stuff
19:13
troy_s
Should be done on the view / display transform, NOT the data.
19:14
troy_s
You only bake to the data when you are outputting generally.
19:14
troy_s
This has a bunch of upside.
19:14
troy_s
For example, say you are doing a grade in WfE
19:14
troy_s
and you have several shots that are similar
19:14
troy_s
maybe 20
19:14
troy_s
grading them all is a royal pain in the ass if you have to manually get yourself in order.
19:14
slikdigit
but.... I'm guessing you *should* be able to apply a single different transform to get an equivalent (depending on the characteristics of the displays) look on other displays (like a film print vs. computer monitor)
19:14
troy_s
What you _could_ do to make it easier is grade a single shot that represents most of the shots in a similar context as an entry.
19:15
troy_s
Yep.
19:15
troy_s
If you grade a frame of WfE, you could A) use the CDL and apply it to all the shots (this too is usually stored in a file generally referred to as a LUT, though it might not be a LUT per se)
19:15
troy_s
B) Use more sophisticated primary and secondary and bending transforms
19:16
aombk
maybe i should make the apertus anthem or the beta theme or something :P
19:16
troy_s
In the case of B, you can take your node setup, and use an RGB image with all possible colors in it, or slightly less, and do a 'before' and 'after' image.
19:16
troy_s
From those two images, you can generate a complex 3D LUT, which can be slapped onto your other 30 shots
19:16
troy_s
as an entry point.
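For option A, a minimal per-channel sketch of the ASC CDL math (slope, offset, power; the saturation term is left out here), the kind of grade that can be reused across the similar shots as an entry point; the example values are arbitrary:

```python
def apply_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0), power=(1.0, 1.0, 1.0)):
    out = []
    for c, s, o, p in zip(rgb, slope, offset, power):
        v = c * s + o
        out.append(max(v, 0.0) ** p)   # clamp negatives before applying the power
    return tuple(out)

# The same slope/offset/power triplet can then be reused across the
# similar shots as the shared entry-point grade.
print(apply_cdl((0.2, 0.4, 0.6), slope=(1.1, 1.0, 0.9), offset=(0.02, 0.0, -0.01), power=(0.95, 1.0, 1.05)))
```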
19:16
wescotte
joined the channel
19:17
slikdigit
interesting
19:17
slikdigit
kinda wish I made to siggraph this year
19:17
slikdigit
we could have had a fun talk/demo
19:17
troy_s
slikdigit: Oh it would have been a blast.
19:17
troy_s
I was _really_ pissed when I missed the last one here (was on some nightmare)
19:17
se6astian
btw we are the most funded project in austria on indiegogo: https://www.indiegogo.com/explore?filter_browse_balance=true&filter_country=CTRY_AT&filter_quick=most_funded
19:18
troy_s
slikdigit: Looking for something for you
19:18
troy_s
slikdigit: Jeremy used that exact system I just described to generate the ACES transforms a while back
19:18
se6astian
by a small margin of 123,986 € :)
19:19
troy_s
slikdigit: Once you wrap your head around the stuff, you can sort of see why I was keen on getting more correct grading structure in place in B.
19:19
slikdigit
yeah
19:20
slikdigit
brb
19:20
baldand
troy_s: ping
19:20
troy_s
Bertl: Woop.
19:20
troy_s
damn
19:20
troy_s
baldand: Woop.
19:20
troy_s
baldand: Great to see you here. Had no clue you skulked
19:20
troy_s
baldand: Amazing work on the ML Raw Viewer
19:21
troy_s
REALLY amazing.
19:21
baldand
troy_s: thanks
19:21
troy_s
baldand: I was hoping I might be able to twist your arm to integrate two libraries that will make your world a hell of a lot easier.
19:21
troy_s
baldand: In particular, OCIO
19:21
troy_s
baldand: So that we can use the plethora of LUTs and matrix transforms out there.
19:21
troy_s
baldand: But to a certain degree, OIIO as well, so that we can dump to DPX / EXR etc.
19:22
troy_s
baldand: Familiar with those two libs (They have python bindings as well)
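A rough sketch of what the OCIO hookup might look like from Python (API names as in the 1.x bindings; the config path and colorspace names are placeholders, not anything MlRawViewer actually ships):

```python
import PyOpenColorIO as OCIO

# Load a config (e.g. an ACES or nuke-default config.ocio) and build a
# processor from scene-linear to a display space; both colorspace names
# are assumptions and depend on the config in use.
config = OCIO.Config.CreateFromFile("config.ocio")
processor = config.getProcessor("linear", "sRGB")

def to_display(flat_rgb):
    """flat_rgb: flat [R, G, B, R, G, B, ...] list of scene-linear floats."""
    return processor.applyRGB(flat_rgb)   # CPU path; OCIO can also emit shader text for GPU previews
```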
19:22
baldand
troy_s: I'm not against it if it can be done with good performance. Would be easier to get code contributions though. My dev time is limited
19:23
troy_s
baldand: It has GPU previews, with CPU for higher grade.
19:23
troy_s
(namely tetrahedral interpolation on 3D LUTs.)
19:23
baldand
troy_s: I've been trying to keep as much as possible on GPU, AMaZE being the main exception
19:24
troy_s
baldand: GPU is great for previews, sucks for quality generally.
19:24
troy_s
(as many of the higher quality interpolations aren't available)
19:24
troy_s
But you can flip back and forth via OCIO painlessly.
19:24
troy_s
(No code overhead)
19:24
slikdigit
left the channel
19:26
baldand
troy_s: the thing is, GPUs just have way more performance than CPUs
19:27
troy_s
Sure. But they have always had tradeoffs.
19:27
troy_s
Not everyone cares about performance, and doubly so when it comes to material that needs effects work etc.
19:27
slikdigit
joined the channel
19:27
troy_s
For an offline? Absolutely (as with any playback)
19:28
troy_s
but an online? You want the highest grade no matter what the performance cost. Especially given you are only conforming / going to archival once.
19:28
troy_s
slikdigit: Damn. Can't find that link. It's a pretty easy node setup to rig it up.
19:28
troy_s
slikdigit: You could generate those LUTs very easily.
19:36
dmjnova
troy_s: from my understanding you usually *can* program the GPU to give you quality but you sacrifice a significant amount of the performance gain.
19:37
troy_s
dmjnova: Possibly. Not too familiar with that aspect, other than the GLSL shaders seem to work fine for some algos and others not so much.
19:37
troy_s
dmjnova: (CUDA / OpenCL being another creature altogether of course)
19:41
dmjnova
I'm talking specifically about CUDA and OpenCL
19:42
dmjnova
and the single/double precision calculations
19:42
troy_s
dmjnova: Most of the accel code I see is shader
19:42
dmjnova
ah
19:55
lab-bot
sebastian reopened T41: Create crowd funding backer FAQ article as "Open". http://lab.apertus.org/T41
19:58
intracube
left the channel
20:12
intracube
joined the channel
20:13
bassam__
joined the channel
20:15
slikdigit
left the channel
20:15
bassam__
changed nick to: slikdigit
20:15
slikdigit
left the channel
20:15
slikdigit
joined the channel
20:20
lab-bot
sebastian-test committed rBH52cd73b4d336: added new backplate variant and top plate riser (authored by sebastian-test).
20:27
dmjnova
left the channel
20:36
lab-bot
left the channel
20:36
lab-bot
joined the channel
20:38
dmjnova
joined the channel
20:49
lab-bot
sebastian created T103: update header bar that connects our platforms together. http://lab.apertus.org/T103
20:57
Rebelj12a2
joined the channel
20:57
Rebelj12a2
left the channel
20:58
bassam__
joined the channel
20:59
slikdigit
left the channel
20:59
bassam__
changed nick to: slikdigit
20:59
slikdigit
left the channel
20:59
slikdigit
joined the channel
21:45
bassam__
joined the channel
21:47
slikdigit
left the channel
21:52
bassam__
left the channel
21:55
dmjnova
left the channel
22:02
comradekingu
left the channel
22:19
dmjnova
joined the channel