#apertus IRC Channel Logs

2015/06/14

Timezone: UTC


01:04
wescotte
left the channel
01:36
wescotte
joined the channel
01:44
fsteinel
joined the channel
01:47
fsteinel_
left the channel
01:59
g3gg0
left the channel
02:22
wescotte_
joined the channel
02:23
wescotte
left the channel
02:58
wescotte_
left the channel
03:33
wescotte_
joined the channel
04:41
ItsMeLenny
joined the channel
04:43
wescotte_
left the channel
05:37
Bertl
off to bed now ... have a good one everyone!
05:37
Bertl
changed nick to: Bertl_zZ
07:27
niemand
joined the channel
07:55
niemand
left the channel
08:27
niemand
joined the channel
08:46
niemand
left the channel
09:48
g3gg0
joined the channel
09:50
intracube
left the channel
11:07
intracube
joined the channel
14:01
comradekingu
left the channel
14:30
Bertl_zZ
changed nick to: Bertl
14:30
Bertl
morning folks!
14:48
comradekingu
joined the channel
15:05
aombk3
joined the channel
15:10
aombk2
left the channel
16:18
ItsMeLenny
left the channel
20:19
se6astian|away
changed nick to: se6astian
20:21
se6astian
I am back!
20:21
se6astian
good evening
20:21
Bertl
\o/ wb
20:22
troy_s
Bertl: Can you concoct a method to determine the strictly linear region of the sensors?
20:23
troy_s
Bertl: (for any given set of variables such as voltage etc.)
20:24
se6astian
thanks, ordering oshpark now as requested
20:27
Bertl
troy_s: recently a colleague suggested using an integrating sphere for such tests
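(The linearity test troy_s asks about could be automated roughly as follows: sweep a known light source over a range of intensities, record the mean sensor response at each step, fit a line through the mid-range, and flag where the response deviates from that line. The data, tolerance, and helper name below are illustrative assumptions, not a measured apertus procedure.)

```python
import numpy as np

# Hypothetical mean sensor response vs. relative exposure (e.g. from an
# integrating sphere swept over known intensities): the bottom sits at a
# noise floor, the top rolls off into the non-linear region.
exposure = np.linspace(0.0, 1.0, 21)
response = np.clip(0.9 * exposure, 0.02, None)                    # noise floor
response = np.where(response > 0.7,
                    0.7 + 0.3 * (response - 0.7), response)       # highlight roll-off

def linear_region(x, y, rel_tol=0.02):
    """Return (lo, hi) exposure bounds where y tracks a straight-line
    fit through the mid-range within rel_tol of the peak response."""
    mid = slice(len(x) // 3, 2 * len(x) // 3)      # assume mid-range is linear
    slope, intercept = np.polyfit(x[mid], y[mid], 1)
    ok = np.abs(y - (slope * x + intercept)) <= rel_tol * np.max(y)
    idx = np.flatnonzero(ok)
    return x[idx[0]], x[idx[-1]]

lo, hi = linear_region(exposure, response)
```

The same sweep repeated per gain/voltage setting would map out how the linear window moves with each variable.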
20:29
troy_s
Bertl: Explain?
20:29
troy_s
I ask because it is important for the resultant generation of the conversion LUTs.
20:30
troy_s
Using only the linear region is going to yield entirely unacceptable results for both post production and photographers as the highlight roll off won't be terribly accurate.
20:31
troy_s
So my desire is to craft a process that generates accurate 3D LUTs across the entire sensor range, for each of the ISO gain settings etc. Similar to what Arri does for the Alexa LUTs.
20:31
Bertl
se6astian: thanks!
20:31
Bertl
I think the process can be fairly automated
20:32
troy_s
Under controlled lighting
20:33
troy_s
My gut says a very good 3D LUT can be generated off of several test charts that pin each of the sensitivity regions
20:33
troy_s
Not sure how to actually generate the data from the chart positions (simplistically say, a chart at linear region, a chart near floor, and a chart near head.)
20:34
troy_s
Probably plausible doing it in subsequent shots using ND or netting lights down, doesn't matter tremendously here.
20:34
troy_s
Merging the data is where it gets murky here. How to interpolate from one chart data set to the next.
20:35
troy_s
Basically, in post (and why highlight recovery is a bad idea) one wants accurate RGB representations across the entire range. Matrix / single linear region chart reads aren't going to deliver this.
20:37
Bertl
sounds like you would like an Early Beta ASAP :)
20:37
troy_s
Well I would prefer it if a big brainer can help me come up with the process first.
20:38
troy_s
Bertl: Any ideas on how to merge a series of charts?
20:44
Bertl
well, the chart is supposed to give correct results (after processing) under certain circumstances (within the range)
20:45
Bertl
it will result in a volume which maps (R,G,B) tuples to (X,Y,Z)
20:45
Bertl
(in the most generic form)
20:46
Bertl
so, given that all charts are within the range (no clipping) but differently illuminated you should get the same mapping, no?
20:47
Bertl
so I would say, the resulting transformations can be combined or averaged
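(Bertl's combine-or-average idea can be sketched as follows. `lut_a` and `lut_b` are hypothetical 3D LUTs on a shared grid, each mapping quantized (R, G, B) to (X, Y, Z) and measured from a differently illuminated chart; NaN marks grid points a chart could not reach. This is a minimal sketch, assuming both charts were processed onto the same grid.)

```python
import numpy as np

# Two hypothetical 17^3 LUTs, each filled in only over the volume its
# chart actually covered (NaN elsewhere).
N = 17
lut_a = np.full((N, N, N, 3), np.nan)
lut_b = np.full((N, N, N, 3), np.nan)
lut_a[:12] = 0.5    # chart A covers the lower part of the range
lut_b[8:] = 0.6     # chart B covers the upper part, overlapping A

# Average wherever both charts produced a mapping; otherwise take
# whichever chart reached that grid point.
merged = np.nanmean(np.stack([lut_a, lut_b]), axis=0)
```

If the two charts were processed correctly, the overlap averages out measurement noise rather than hiding a real disagreement.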
20:47
slikdigit_
joined the channel
20:50
troy_s
Bertl: Except the sensor is nonlinear in regions
20:50
troy_s
In particular, that upper edge of importance.
20:51
troy_s
So what I am hoping for is a 3D LUT that accurately models both that curve and the curve near the toe such that we get accurate transforms.
20:51
Bertl
shouldn't matter if the transformation is correct
20:51
Bertl
don't forget, the non-linear range is only problematic for matrices
20:51
troy_s
Exactly
20:51
Bertl
(which cannot compensate for that easily)
20:52
troy_s
The issue is that we can only sample the chart at a given intensity range.
20:52
troy_s
So if we have a chart at say, the range the display referred paper can represent, then another chart near the edge of the sensor head
20:52
troy_s
We have two charts worth of data to merge into a 3D LUT.
20:53
troy_s
The latter representing that critical highlights range.
20:53
Bertl
yes, and both will give an overlapping volume
20:53
troy_s
The problem is feathering between the two 3D LUTs.
20:53
troy_s
Exactly.
20:54
Bertl
as I said, if the process (chart to transformation) is not flawed, the overlapping regions should be (almost) identical
20:54
troy_s
Somehow interpolating those, because they obviously won't be a precise match such that we could simply concatenate after two truncs.
20:55
troy_s
Hrm... Maybe a simple truncation would suffice. Was hoping for something a tad more elegant.
20:55
Bertl
average?
20:56
troy_s
Linear average across the 3D LUT values for a curved range where the seams are?
20:56
Bertl
if they differ significantly, the mapping process is flawed
20:56
troy_s
Well they will be close.
20:57
Bertl
you can also fade one into the other
20:57
troy_s
That is exactly the idea. Feather the seams.
20:57
troy_s
More dense (say a series of bracketed shots) would yield more accurate results.
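(The fade/feather the two are converging on could look like the sketch below: blend the low-range LUT into the high-range LUT with a weight driven by input intensity, using a smoothstep ramp so the seam has no hard edge. The band edges `lo` and `hi` are illustrative assumptions, not measured crossover points.)

```python
import numpy as np

def feather_blend(lut_low, lut_high, lo=0.6, hi=0.8):
    """Blend two same-sized 3D LUTs: use lut_low below `lo`, lut_high
    above `hi`, and a smooth ramp across the [lo, hi] band between."""
    n = lut_low.shape[0]
    # Normalized input intensity at each grid point (mean of R, G, B).
    r, g, b = np.meshgrid(*[np.linspace(0, 1, n)] * 3, indexing="ij")
    t = np.clip(((r + g + b) / 3 - lo) / (hi - lo), 0.0, 1.0)
    w = t * t * (3 - 2 * t)            # smoothstep: seamless feather
    return (1 - w)[..., None] * lut_low + w[..., None] * lut_high

n = 9
lut_low = np.zeros((n, n, n, 3))       # stand-in for the shadow/mid chart LUT
lut_high = np.ones((n, n, n, 3))       # stand-in for the highlight chart LUT
blended = feather_blend(lut_low, lut_high)
```

With a denser bracket series, the same scheme extends to more than two LUTs by feathering each adjacent pair over its overlap band.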
21:27
intracube
left the channel
21:35
intracube
joined the channel
22:11
se6astian
time for bed
22:11
se6astian
good night
22:11
se6astian
changed nick to: se6astian|away
22:30
jucar
left the channel
22:44
jucar
joined the channel
00:04
slikdigit_
left the channel
00:54
intracube
left the channel