#apertus IRC Channel Logs

2015/06/14

Timezone: UTC


00:04
wescotte
left the channel
00:36
wescotte
joined the channel
00:44
fsteinel
joined the channel
00:47
fsteinel_
left the channel
00:59
g3gg0
left the channel
01:22
wescotte_
joined the channel
01:23
wescotte
left the channel
01:58
wescotte_
left the channel
02:33
wescotte_
joined the channel
03:41
ItsMeLenny
joined the channel
03:43
wescotte_
left the channel
04:37
Bertl
off to bed now ... have a good one everyone!
04:37
Bertl
changed nick to: Bertl_zZ
06:27
niemand
joined the channel
06:55
niemand
left the channel
07:27
niemand
joined the channel
07:46
niemand
left the channel
08:48
g3gg0
joined the channel
08:50
intracube
left the channel
10:07
intracube
joined the channel
13:01
comradekingu
left the channel
13:30
Bertl_zZ
changed nick to: Bertl
13:30
Bertl
morning folks!
13:48
comradekingu
joined the channel
14:05
aombk3
joined the channel
14:10
aombk2
left the channel
15:18
ItsMeLenny
left the channel
19:19
se6astian|away
changed nick to: se6astian
19:21
se6astian
I am back!
19:21
se6astian
good evening
19:21
Bertl
\o/ wb
19:22
troy_s
Bertl: Can you concoct a method to determine the strictly linear region of the sensors?
19:23
troy_s
Bertl: (for any given set of variables such as voltage etc.)
19:24
se6astian
thanks, ordering oshpark now as requested
19:27
Bertl
troy_s: recently a colleague suggested using an integrating sphere for such tests
19:29
troy_s
Bertl: Explain?
19:29
troy_s
I ask because it is important for the resultant generation of the conversion LUTs.
19:30
troy_s
Using only the linear region is going to yield entirely unacceptable results for both post production and photographers as the highlight roll off won't be terribly accurate.
19:31
troy_s
So my desire is to craft a process that generates accurate 3D LUTs across the entire sensor range, for each of the ISO gain settings etc. Similar to what Arri does for the Alexa LUTs.
19:31
Bertl
se6astian: thanks!
19:31
Bertl
I think the process can be fairly automated
19:32
troy_s
Under controlled lighting
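A rough sketch of what such an automated linearity check could look like, assuming a series of flat-field captures at known relative illumination (the kind of data an integrating sphere setup would provide); the function name, the 1% tolerance, and the sample response below are illustrative assumptions, not existing apertus° tooling:

```python
import numpy as np

def find_linear_region(exposure, response, tolerance=0.01):
    """Estimate the sensor's linear region from flat-field captures.

    exposure  -- known relative illumination/exposure per capture (1D array)
    response  -- mean raw pixel value per capture, black level subtracted
    tolerance -- allowed relative deviation from the linear fit
    """
    exposure = np.asarray(exposure, dtype=float)
    response = np.asarray(response, dtype=float)

    # Fit a straight line through the lower half of the sweep, where the
    # sensor is assumed to behave linearly.
    low = exposure <= np.median(exposure)
    slope, offset = np.polyfit(exposure[low], response[low], 1)
    predicted = slope * exposure + offset

    # Flag every sample whose response deviates from the fit by more than
    # the tolerance; the linear region is where the deviation stays small.
    deviation = np.abs(response - predicted) / np.maximum(predicted, 1e-9)
    linear = deviation <= tolerance
    return exposure[linear].min(), exposure[linear].max()

# Synthetic example: a response that rolls off near saturation.
exp = np.linspace(0.05, 1.0, 20)
resp = 4000 * np.minimum(exp, 0.8) + 200 * np.maximum(exp - 0.8, 0)
print(find_linear_region(exp, resp))  # roughly (0.05, 0.8)
```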
19:33
troy_s
My gut says a very good 3D LUT can be generated off of several test charts that pin each of the sensitivity regions
19:33
troy_s
Not sure how to actually generate the data from the chart positions (simplistically say, a chart in the linear region, a chart near the floor, and a chart near the head).
19:34
troy_s
Probably plausible doing it in subsequent shots using ND or netting lights down, doesn't matter tremendously here.
19:34
troy_s
Merging the data is where it gets murky here. How to interpolate from one chart data set to the next.
19:35
troy_s
Basically, in post (and why highlight recovery is a bad idea) one wants accurate RGB representations across the entire range. Matrix / single linear region chart reads aren't going to deliver this.
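For context, the 3D LUT being discussed is an N×N×N lattice of output colours indexed by input RGB, with values in between sampled by trilinear interpolation; a minimal, self-contained sketch of such a lookup (the array layout and names are assumptions for illustration only):

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Map one RGB triplet (values in [0, 1]) through an N x N x N x 3 LUT
    using trilinear interpolation between the eight surrounding lattice points."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = pos - lo

    out = np.zeros(3)
    # Weighted sum over the 8 corners of the enclosing lattice cell.
    for corner in range(8):
        idx = [hi[c] if (corner >> c) & 1 else lo[c] for c in range(3)]
        weight = np.prod([frac[c] if (corner >> c) & 1 else 1.0 - frac[c]
                          for c in range(3)])
        out += weight * lut[idx[0], idx[1], idx[2]]
    return out

# Identity LUT: output equals input, so the lookup just reproduces the colour.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut([0.25, 0.5, 0.75], identity))  # ~[0.25, 0.5, 0.75]
```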
19:37
Bertl
sounds like you would like an Early Beta ASAP :)
19:37
troy_s
Well I would prefer it if a big brainer can help me come up with the process first.
19:38
troy_s
Bertl: Any ideas on how to merge a series of charts?
19:44
Bertl
well, the chart is supposed to give correct results (after processing) under certain circumstances (within the range)
19:45
Bertl
it will result in a volume which maps (R,G,B) tuples to (X,Y,Z)
19:45
Bertl
(in the most generic form)
19:46
Bertl
so, given that all charts are within the range (no clipping) but differently illuminated you should get the same mapping, no?
19:47
Bertl
so I would say, the resulting transformations can be combined or averaged
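In its simplest form, the per-chart transformation Bertl describes here is a 3×3 matrix fitted by least squares from the measured patch RGBs to the chart's reference XYZ values; a sketch of that fit and of averaging two per-chart fits (the patch data is synthetic and purely illustrative, and a matrix like this only holds within the linear range, as Bertl notes further down):

```python
import numpy as np

def fit_chart_matrix(camera_rgb, reference_xyz):
    """Least-squares 3x3 matrix M such that camera_rgb @ M ~= reference_xyz.

    camera_rgb    -- N x 3 linear camera RGB of the chart patches
    reference_xyz -- N x 3 reference CIE XYZ values of the same patches
    """
    m, residuals, rank, _ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)
    return m

def average_transforms(matrices):
    """Combine per-chart matrices (e.g. from differently illuminated charts)."""
    return np.mean(matrices, axis=0)

# Illustrative patch data: two exposures of the same chart, both within range.
rgb_chart_a = np.random.rand(24, 3)
true_m = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])
xyz_ref = rgb_chart_a @ true_m
rgb_chart_b = rgb_chart_a * 0.5          # same chart, half the illumination
m_a = fit_chart_matrix(rgb_chart_a, xyz_ref)
m_b = fit_chart_matrix(rgb_chart_b, xyz_ref * 0.5)
print(average_transforms([m_a, m_b]))    # both fits recover ~true_m
```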
19:47
slikdigit_
joined the channel
19:50
troy_s
Bertl: Except the sensor is nonlinear in regions
19:50
troy_s
In particular, that upper edge of importance.
19:51
troy_s
So what I am hoping for is a 3D LUT that accurately models both that curve and the curve near the toe such that we get accurate transforms.
19:51
Bertl
shouldn't matter if the transformation is correct
19:51
Bertl
don't forget, the non-linear range is only problematic for matrices
19:51
troy_s
Exactly
19:51
Bertl
(which cannot compensate for that easily)
19:52
troy_s
The issue is that we can only sample the chart at a given intensity range.
19:52
troy_s
So if we have a chart at say, the range the display referred paper can represent, then another chart near the edge of the sensor head
19:52
troy_s
We have two charts worth of data to merge into a 3D LUT.
19:53
troy_s
The latter representing that critical highlights range.
19:53
Bertl
yes, and both will give an overlapping volume
19:53
troy_s
The problem is feathering between the two 3D LUTs.
19:53
troy_s
Exactly.
19:54
Bertl
as I said, if the process (chart to transformation) is not flawed, the overlapping regions should be (almost) identical
19:54
troy_s
Somehow interpolating those, because they obviously won't be a precise match such that we could simply concatenate after two truncs.
19:55
troy_s
Hrm... Maybe a simple truncation would suffice. Was hoping for something a tad more elegant.
19:55
Bertl
average?
19:56
troy_s
Linear average across the 3D LUT values for a curved range where the seams are?
19:56
Bertl
if they differ significantly, the mapping process is flawed
19:56
troy_s
Well they will be close.
19:57
Bertl
you can also fade one into the other
19:57
troy_s
That is exactly the idea. Feather the seams.
19:57
troy_s
Denser sampling (say, a series of bracketed shots) would yield more accurate results.
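A sketch of one way the feathering discussed above could work: blend the mid-range LUT into the highlight LUT with a smooth weight driven by input intensity, so the seam between the two chart-derived volumes is faded rather than cut. The blend range, array shapes, and function name are illustrative assumptions:

```python
import numpy as np

def feather_luts(lut_mid, lut_high, low=0.6, high=0.85):
    """Blend two N x N x N x 3 LUTs over their lattice points, fading from the
    mid-range LUT to the highlight LUT as input intensity crosses [low, high]."""
    n = lut_mid.shape[0]
    grid = np.linspace(0.0, 1.0, n)
    r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
    intensity = np.maximum.reduce([r, g, b])        # crude per-entry intensity

    # Smoothstep weight: 0 below 'low', 1 above 'high', smooth in between.
    t = np.clip((intensity - low) / (high - low), 0.0, 1.0)
    w = (t * t * (3.0 - 2.0 * t))[..., np.newaxis]

    return (1.0 - w) * lut_mid + w * lut_high

# Illustrative usage with two LUTs of matching size:
n = 17
lut_mid = np.random.rand(n, n, n, 3)
lut_high = np.random.rand(n, n, n, 3)
merged = feather_luts(lut_mid, lut_high)
print(merged.shape)   # (17, 17, 17, 3)
```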
20:27
intracube
left the channel
20:35
intracube
joined the channel
21:11
se6astian
time for bed
21:11
se6astian
good night
21:11
se6astian
changed nick to: se6astian|away
21:30
jucar
left the channel
21:44
jucar
joined the channel
23:04
slikdigit_
left the channel
23:54
intracube
left the channel