
#apertus IRC Channel Logs

2015/08/10

Timezone: UTC


00:28
intracube
left the channel
00:31
fsteinel
left the channel
00:31
fsteinel_
joined the channel
01:00
g3gg0
joined the channel
04:43
jucar
joined the channel
04:52
jucar
left the channel
04:53
jucar
joined the channel
06:17
danieel
left the channel
06:32
tezburma
joined the channel
06:43
danieel
joined the channel
06:48
se6astian|away
changed nick to: se6astian
06:50
se6astian
good morning
07:02
aombk
joined the channel
07:04
aombk2
left the channel
07:21
cbohnens|away
changed nick to: cbohnens
07:21
tezburma
left the channel
07:27
g3gg0
left the channel
07:34
tezburma
joined the channel
08:04
comradekingu
left the channel
08:24
Bertl_oO
changed nick to: Bertl
08:24
Bertl
morning folks!
09:07
jucar
left the channel
09:08
fsteinel_
changed nick to: fsteinel
09:18
jucar
joined the channel
09:46
jucar
left the channel
10:38
Bertl
changed nick to: Bertl_oO
11:49
jucar
joined the channel
12:27
jucar
left the channel
12:29
jucar
joined the channel
12:42
jucar
left the channel
15:02
se6astian
changed nick to: se6astian|away
15:08
intracube
joined the channel
16:30
antiatom
troy_s: Interesting take on NLE's
16:30
antiatom
21:05 < troy_s> The solution is very much possible, but the larger issue is that a non-linear editor shouldn't be doing that sort of thing.
16:30
antiatom
21:05 < troy_s> It should only be providing the 'blueprint' on how it should be done, then another piece of software should be doing the actual work.
16:30
antiatom
SORRY for that spam.
16:32
antiatom
But this reminds me of Kdenlive a bit, where the project file is just XML instructions that are handed over to MLT to affect the multimedia streams with the help of frei0r-plugins and finally ffmpeg for transcoding
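The blueprint-vs-renderer split described here can be sketched in miniature. This is a hypothetical illustration of the idea, not the real MLT schema: the editor only emits a declarative project file, and a separate engine walks it to do the actual work.

```python
# Sketch of the NLE-as-blueprint idea: the editor emits a declarative
# project file (MLT-like XML here), and a separate engine interprets it.
# Element and attribute names are illustrative, not the real MLT schema.
import xml.etree.ElementTree as ET

project = """<mlt>
  <playlist id="timeline">
    <entry producer="clip1.mp4" in="0" out="74"/>
    <entry producer="clip2.mp4" in="25" out="99"/>
  </playlist>
</mlt>"""

def render(project_xml):
    """Stand-in for the render engine: interpret the blueprint."""
    root = ET.fromstring(project_xml)
    jobs = []
    for entry in root.iter("entry"):
        frames = int(entry.get("out")) - int(entry.get("in")) + 1
        jobs.append((entry.get("producer"), frames))
    return jobs

print(render(project))  # [('clip1.mp4', 75), ('clip2.mp4', 75)]
```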
16:32
antiatom
I need to learn more about the things you are talking about... of course I am extremely limited in how much I can grade because I still shoot in crappy Canon H.264 8-bit 420
16:33
antiatom
But this interests me greatly... I am not sure if I should TRY software like Vegas, AVID, or Premiere just to see what I am missing, or stick with Kdenlive
16:33
antiatom
Maybe I would understand the problems more if I tried the non-libre apps?
16:33
antiatom
What do you think troy_s ?
16:42
troy_s
antiatom: FCP, Media Composer, Lightworks, etc. are all capable offline NLEs.
16:43
troy_s
antiatom: Basically the idea is an interchange format and the cross-compatibility thereof. FCPXML had a good thing going until they went ecosystem-focused, but the schema is pretty solid from the FCP years.
16:44
troy_s
(Pre-FCPX, it included specifications for splines, effects blocking, etc. Useful marking / blueprinting for visual effects, post, etc.)
16:45
troy_s
antiatom: Once anyone tries to take a high quality project through a post pipeline, the offline system makes perfect sense.
16:46
troy_s
antiatom: Even your 420 footage can be decoded to a more decent entry point. Use a 32-bit float buffer and scale the Cb / Cr manually in that buffer.
16:47
troy_s
To think of it another way, over three quarters of your color information is subsampled, which means three quarters of your image quality is rounded off poorly if you decode it in an 8-bit reference buffer.
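The decode-into-float approach suggested above can be sketched with numpy. This is a minimal illustration, assuming nearest-neighbour chroma upsampling and BT.601 full-range constants for brevity; a real pipeline would use proper filtering and the footage's actual matrix:

```python
import numpy as np

def yuv420_to_float_rgb(y, cb, cr):
    """Decode 8-bit 4:2:0 planes into a 32-bit float RGB buffer.
    y is (H, W); cb/cr are (H/2, W/2). Nearest-neighbour chroma
    upsampling and BT.601 full-range constants, for brevity."""
    yf = y.astype(np.float32) / 255.0
    # Scale the quarter-resolution chroma up inside the float buffer.
    cbf = np.repeat(np.repeat(cb.astype(np.float32), 2, 0), 2, 1) / 255.0 - 0.5
    crf = np.repeat(np.repeat(cr.astype(np.float32), 2, 0), 2, 1) / 255.0 - 0.5
    r = yf + 1.402 * crf
    g = yf - 0.344136 * cbf - 0.714136 * crf
    b = yf + 1.772 * cbf
    return np.stack([r, g, b], axis=-1)  # (H, W, 3) float32

# Mid-grey test frame: 4x4 luma, 2x2 chroma planes.
rgb = yuv420_to_float_rgb(
    np.full((4, 4), 128, np.uint8),
    np.full((2, 2), 128, np.uint8),
    np.full((2, 2), 128, np.uint8),
)
```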
16:49
g3gg0
joined the channel
17:01
se6astian|away
changed nick to: se6astian
17:09
se6astian
changed nick to: se6astian|away
17:49
slikdigit
joined the channel
18:16
se6astian|away
changed nick to: se6astian
18:28
irieger
joined the channel
18:29
irieger
left the channel
18:29
irieger
joined the channel
18:31
irieger
Hey Bertl_oO
18:34
Bertl_oO
hey irieger!
18:35
Bertl_oO
you're the color research guy, yes?
18:39
irieger
yep, I’m that guy ;-)
18:39
irieger
I assume you got my mail too as I sent it to the team mail?
18:40
Bertl_oO
yes, I got it, very interesting
18:41
Bertl_oO
our local color expert is troy_s, you might be interested to chat with him about color spaces and how bad libre software is :)
18:41
irieger
For everyone else here, a short description of what we are talking about so as not to leave you out: I’m a student and have to write my bachelor’s thesis next semester. Since I started to dive deep into color science, I thought maybe I could work on the Axiom color science to have a cool topic and a real-world use case people can benefit from.
18:42
irieger
Ok, cool to know. Hey troy_s.
18:42
Bertl_oO
technical aspects regarding Axiom Beta should go in my general direction, hardware and software for now
18:42
irieger
Color science is not only a problem in the libre software world …
18:43
Bertl_oO
you can read up on the IRC chat logs (the channel is logged in realtime) if you like
18:43
irieger
I already had a short look into the logs but not too deep.
18:44
irieger
Before I dive too deep into reading, can I ask you something directly?
18:44
Bertl_oO
sure, go ahead
18:45
Bertl_oO
that's what this channel is for ... i.e. just ask, don't ask to ask :)
18:45
irieger
When do you expect to have prototypes of the first stage that are hardware ready?
18:45
Bertl_oO
we already showcased the first working hardware at NAB and later on at Linuxwochen in Vienna
18:45
Bertl_oO
we are currently finalizing the hardware for the Early Betas
18:46
irieger
I know. But as far as I followed, everything was done with the one first prototype with a defective sensor, which was used just for general testing. I mean the Early Beta stage, which should have mostly finalized hardware in regard to the sensor, or am I wrong?
18:47
Bertl_oO
yup, the Sensor Frontend is not expected to change after the Early Beta
18:48
Bertl_oO
if everything goes as expected, we will have the first Early Betas end of this month
19:07
se6astian
hi irieger
19:07
se6astian
I am just reading your email reply atm
19:07
irieger
Please excuse my bad response time. I just came home about an hour ago and I’m currently “compiling” a dinner, and going to the laptop with oily fingers is a bad idea …
19:08
se6astian
just finish that oily phase :)
19:08
se6astian
*I just finished that oily phase
19:08
Bertl_oO
and I think I'm going to make myself something to eat now ... :)
19:12
se6astian
bon appetit
19:13
troy_s
irieger: Greets. No real expert, just been dealing with it for a good few years now.
19:14
troy_s
irieger: learned much with the first prototype and how horrific the actual sensors are. Bertl_oO was very helpful on that front.
19:15
troy_s
irieger: My current thinking is to work out a pipeline for all of a typical cinematic post production pipe; no bending / cheating of non-data into data etc.
19:15
troy_s
A very by-the-book set of colour transforms.
19:16
troy_s
(Suitable for post production work such as CGI and compositing etc.)
19:21
tezburma
left the channel
19:21
slikdigit
left the channel
19:30
slikdigit
joined the channel
19:54
irieger
Bertl_oO: nice to hear that the hardware is making progress
19:55
Bertl_oO
we have been making a lot of progress hardware wise in the last year
19:56
irieger
troy_s: Is it really that horrible? I only have experience with the CMOSIS in the Blackmagic Production Camera 4K, which has really good color if you don’t rely on the Blackmagic Log->Rec709 LUT
19:56
irieger
if you go the ACES route the colors are really nice
19:58
Bertl_oO
ACES being?
19:59
irieger
http://www.oscars.org/science-technology/sci-tech-projects/aces
20:01
irieger
It’s a color space intended as a VFX/grading color space: a linear color space covering the whole visible spectrum to cover all cases. The future for post production
20:02
irieger
And the most important thing: a public standard to ease work between postproduction studios, which often had (or still have) their own color science, which doesn’t help when collaborating.
20:03
se6astian
Bertl_oO: remember we were supposed to get an ACES cap at the colorist mixer, but the next day at NAB they had handed all out already....
20:03
irieger
Besides being a color space, it’s also a toolset to convert to and from this color space. There are transforms for all major cameras to get their files into ACES, and a set of output transforms for various display spaces.
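At its core, the "into ACES" step for a given camera is a 3x3 matrix applied to linear-light RGB. A sketch with numpy — the matrix values below approximate the published linear Rec.709-to-ACES2065-1 (AP0) transform and are for illustration only; real input transforms should be taken from the aces-dev repository:

```python
import numpy as np

# Illustrative 3x3 input transform: linear Rec.709 -> ACES2065-1 (AP0).
# Values approximate the published matrix; for real work, take the
# transforms from https://github.com/ampas/aces-dev.
M = np.array([
    [0.4397, 0.3830, 0.1774],
    [0.0898, 0.8134, 0.0968],
    [0.0175, 0.1115, 0.8710],
])

def to_aces(rgb):
    """Apply the matrix to a linear-light RGB triple."""
    return M @ np.asarray(rgb, dtype=float)

white = to_aces([1.0, 1.0, 1.0])  # rows sum to ~1, so white maps to white
```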
20:04
irieger
Best thing on the output side: The RRT (reference rendering transform), a simulation of film color/light handling with nice highlight roll-off etc.
20:05
Bertl_oO
is this an open standard? can you point me to the documentation?
20:05
irieger
Code is on https://github.com/ampas
20:06
irieger
aces-dev is the official color transformation archive. CTL is the reference implementation for the color transformation language used for the transforms
20:07
irieger
https://github.com/ampas/aces-dev/tree/master/documents contains a link to the documents. They also describe methods to characterize cameras to convert to ACES etc.
20:08
irieger
ACES is a cool toolset, but sadly too little software supports it really well
20:09
Bertl_oO
understood. thanks!
20:10
irieger
troy_s: what are your thoughts on the pipeline front?
20:10
irieger
troy_s: Btw. have you seen my email?
20:23
troy_s
irieger: ACES isn't ideal for CG. The more limited gamut ACEScg is better.
20:23
troy_s
irieger: I haven't seen your mail.
20:24
irieger
sent it to you as a message on the lab.
20:25
irieger
haven’t really had experience with CG and ACEScg. Just used ACES for some color grading and like it very much in general. Some good concepts behind it.
20:26
troy_s
(Little known factoid is that tristimulus colour models all behave differently when performing physical based manipulations.)
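The point above is easy to demonstrate: the same physical manipulation gives different answers depending on which encoding the arithmetic is done in. A minimal sketch, assuming a simple 2.2 power function as a stand-in for a display transfer curve:

```python
# Mix black (0.0) and white (1.0) half and half, two ways.
gamma = 2.2

# Physically correct: average in linear light.
mix_linear = (0.0 + 1.0) / 2                # 0.5 in linear light

# Naive: average the *encoded* values, then decode back to linear.
mix_encoded = ((0.0 + 1.0) / 2) ** gamma    # ~0.218 in linear light

# Same operation, noticeably different result: the encoded-space mix
# is almost a stop darker than the physically correct one.
print(mix_linear, round(mix_encoded, 3))
```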
20:26
troy_s
irieger: Will look.
20:26
troy_s
I don't check there too often, and email notifications apparently don't work.
20:26
troy_s
The only thing ACES really does is standardize the pipe
20:26
troy_s
No magic.
20:26
Bertl_oO
troy_s: seem to work fine here (email notifications)
20:27
troy_s
It is wide enough gamut to cover all cameras, so it is great from an archival perspective.
20:27
troy_s
You still need the native camera space for things like keying to avoid crosstalk etc.
20:27
troy_s
Bertl_oO: Odd.
20:27
irieger
I also had no problem with notifications yet. The only thing ACES does is offer a standard way to work :D And a nice color rendering with the film-like reference rendering
20:28
troy_s
Sure. ACES is definitely viable for some LUTs of course.
20:28
troy_s
My bigger worry is trying to get a very good set of LUTs for the entire dynamic range.
20:28
irieger
I don’t want to present ACES as a solution for everything. But it’s a nice toolset. Not fit for everything
20:30
troy_s
It is great that someone has dealt with it.
20:30
troy_s
Smart cookies involved with it.
20:35
irieger
I even thought it might be nice to have different spaces for recording: a nice log format and maybe something else. There are some new ideas about how to best store camera data using other concepts; I’m trying to find more to read about it in the coming weeks. The idea is: if you are OK with storing data in a way not really usable for direct grading, you can do some nice stuff, I heard. You’d then have to use ACES or a LUT to get to a grading space, maybe
20:35
irieger
I like to have options. Most commercial companies offer too few choices in some regards …
20:37
irieger
And getting some nice LUTs is manageable if you have a good starting point
20:38
troy_s
irieger: Looks like a great project.
20:38
troy_s
irieger: The camera data storage is somewhat irrelevant as it always wants to store the entire dataset as it is best rendered on the camera.
20:39
troy_s
There is a piecewise linear set of knees that can be coded into the pipeline that might allow us to get a rough (very rough) loglike response for some perceptual compression.
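A rough sketch of such a piecewise-linear "knee" encode: a few linear segments with decreasing slope to compress highlights. The breakpoints below are invented for illustration, not tuned to any real sensor:

```python
# Piecewise-linear approximation of a log-like transfer curve:
# steep segments near black preserve shadow precision, shallow
# segments roll off the highlights. Knee positions are made up.
KNEES = [  # (input_threshold, output_at_threshold, slope_above)
    (0.00, 0.00, 4.0),   # shadows: steep, keeps precision near black
    (0.10, 0.40, 1.5),   # midtones
    (0.30, 0.70, 0.30),  # highlights: shallow roll-off
]

def encode(x):
    """Map a linear scene value x (0..1) through the knee curve."""
    for thresh, out, slope in reversed(KNEES):
        if x >= thresh:
            return out + slope * (x - thresh)
    return 0.0

# Continuous and monotonic: encode(0.1) == 0.4, encode(0.3) == 0.7,
# and encode(1.0) == 0.7 + 0.3 * 0.7 == 0.91
```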
20:40
troy_s
irieger: Having a handle on a typical post production pipe also helps on the colour front
20:41
troy_s
irieger: Have you read Mr. Selan's document at cinematiccolor.com? It is the Visual Effects Society approved paper now.
20:42
troy_s
It highlights the difference between display referred and scene referred models, as well as touches on the why a little.
20:48
irieger
I’ve had it on my hard disk for some weeks now but have only had a short look. I had to finish some stuff from the semester over the last months, which took away some of my focus lately.
20:49
troy_s
irieger: Worth a read.
20:49
troy_s
Anyways, hit me via email
20:49
troy_s
Easier for me to reply and such
20:49
irieger
Sure. Thanks for reminding me of this paper
20:50
irieger
what’s your email address?
20:50
troy_s
You have it via PM.
21:13
se6astian
gotta go
21:13
se6astian
time for bed :)
21:13
se6astian
good night
21:14
irieger
Good night
21:14
se6astian
changed nick to: se6astian|away
21:27
antiatom
https://github.com/ampas/CLF/blob/master/LICENSE.md
21:50
Bertl_oO
tx
23:01
irieger
have a good night. I’m off for today
23:01
irieger
left the channel
23:18
intracube
changed nick to: intracube|away
23:20
g3gg0
left the channel