
#apertus IRC Channel Logs

2013/11/20

Timezone: UTC


00:02
Sasha_C
How are you going tonight dmj_nova?
00:03
dmj_nova
pretty good
00:03
dmj_nova
mulling over cinema camera forms
00:17
Sasha_C
Can you provide a link to any examples?
00:25
dmj_nova
not yet
00:25
dmj_nova
rexbron: has been pointing out examples of what he likes
00:28
dmj_nova
been thinking about how to incorporate those thoughts into an AXIOM camera
00:28
dmj_nova
Sasha_C: What cameras have you worked with?
00:31
Sasha_C
I've worked with Canon and Pentax DSLRs, the H16 Bolex (with reflex viewfinder) and the Panasonic HVX200
00:32
Sasha_C
However, since having purchased a Mamiya RB67 Pro SD, I find that I'm now spending more and more time in photography
00:34
dmj_nova
Okay
00:34
dmj_nova
I've mostly been a Canon DSLR user myself
00:36
dmj_nova
It seems that there's a bit of a divide between "box" (DSLR) camera ergonomics and "balance" (shoulder) camera ergonomics.
00:44
dmj_nova
The desire, it seems, is for a shoulder-mount camera with good balance and very little fiddling and assembling of separate pieces on set
00:45
dmj_nova
this: http://www.aaton.com/products/film/delta/index.php
00:45
dmj_nova
not this: http://imgur.com/Rp7Z9bt
01:00
Sasha_C
In that last one, the rig is probably more expensive than the camera?
01:03
Sasha_C
From memory, one of the guys from yolk contacted us (could've been last year, or very early this year) and said they were interested in working together with their concept here: http://www.yolk.org/y2.html
02:04
dmj_nova
Sasha_C: yes it is
02:05
dmj_nova
interesting thing with the yolk bit
02:07
dmj_nova
I think I should come up with a concept focused on ergonomics and simplicity of use
02:14
tonsofpcs
left the channel
02:45
aombk
joined the channel
02:45
aombk
left the channel
02:45
aombk
joined the channel
05:00
dmj_nova
rexbron: troy_s: what is the case for left handed camera operators?
05:01
dmj_nova
do they generally operate cameras right-handed?
05:01
dmj_nova
I assume that's the case
05:16
dmj_nova
rexbron: thoughts on this camera: http://www.ikonoskop.com/begood/image_db.php?id=224&w=700&ne=1
05:16
Bertl
yes, but there are special viewfinders
05:16
Bertl
(for left 'handed' operators)
05:18
Bertl
http://irc.13thfloor.at/ALOG/2013-11/LOG_2013-11-12.txt 1384290780
05:22
dmj_nova
Bertl: thanks
05:22
Bertl
np
05:22
dmj_nova
So it seems okay to design a control scheme focused around right-handed operation
05:23
dmj_nova
it seems southpaws would have learned to work that way anyway
05:23
Bertl
I was told that in professional environments, the cameraperson doesn't do anything but aim the camera and control the focus
05:23
Bertl
everything else is basically done by the camera assistant
05:24
dmj_nova
that everything else being exposure, exchanging the magazine, and what else?
05:25
Bertl
same log, @1384290720
05:26
Bertl
I think almost everything which can be controlled on the camera is usually done by the assistant
05:26
dmj_nova
hmm
05:27
dmj_nova
Well, I suppose a shoulder-mount system can have two methods of interface:
05:27
dmj_nova
1) assistant with LCD panel on left side
05:27
Bertl
I'm pretty sure that is where indie differs from pro
05:28
dmj_nova
2) feel-controls with display in viewfinder
05:29
Bertl
I don't think that 'pro' camerawomen want to bother with anything in the viewfinder except sharpness and content (i.e. target)
05:29
dmj_nova
the latter would allow the operator to control the system without taking it off the shoulder or taking their eye from the viewfinder
05:31
dmj_nova
The same viewfinder hardware would be able to accommodate either HUD+content or simply content though
05:31
Bertl
sure, not saying that we need/want to focus on only one
05:32
dmj_nova
What we do need is specific modes of operation with sane defaults
05:32
dmj_nova
1) "indie" mode which is made for single-person operation
05:32
dmj_nova
2) "pro" mode which is tailored to 2-3 person teams
05:34
dmj_nova
and we sure as hell don't want to copy the black magic camera
05:35
Bertl
in what regard?
05:35
dmj_nova
ergonomics on that thing are...I can tell they're bad just by looking at it
05:36
dmj_nova
They couldn't even bother to conform the grip to one's hand
05:36
dmj_nova
it's a slab with a touchscreen
05:37
Bertl
maybe they had just a person with unusual hands :)
05:37
dmj_nova
you mean a person with square hands?
05:38
Bertl
yeah, maybe a stone golem :)
05:38
dmj_nova
now compare the BMCC to any Canon DSLR.
05:39
dmj_nova
Which one do you want to hold
05:39
dmj_nova
(I'm comparing to a DSLR, since it's clearly designed to be "like a DSLR but with RAW video")
05:39
Bertl
the DSLR, because it seems a lot lighter?
05:40
dmj_nova
Are you sure?
05:40
Bertl
well, probably depends on the lens system
05:40
Bertl
but it looks heavy on pictures
05:41
dmj_nova
ah, yeah it is heavier than most dslrs
05:41
troy_s
Bertl / dmj_nova Most left folks (left eyed etc) simply train for right.
05:42
troy_s
It makes sense in the case of eyepieces because some cameras don't give you an option, so you learn right.
05:42
Bertl
yes, that is how I interpreted the original comment
05:42
troy_s
Further, if operating right, the idea is to be able to open your left eye to see, so again, left eyed operators train right as convention.
05:43
troy_s
And professional work the first pulls focus, not the operator.
05:44
Bertl
btw, what is nowadays the politically correct name? cameraperson or camerawoman/men or still cameraman/men?
05:44
troy_s
There is little difference on an indie; you still have someone operating (DP or an op) and a 1st to yank focus.
05:45
dmj_nova
troy_s: as far as pulling focus, how critical is latency?
05:45
troy_s
Bertl: In North America "camera operator" is pretty typical.
05:45
Bertl
ah, nice, tx
05:45
troy_s
Cameraman used to refer to the DP... now is a bit of an anachronism.
05:46
troy_s
dmj_nova: There is zero latency. If there is latency, the system is generally avoided like the plague.
05:46
dmj_nova
troy_s: thanks!
05:46
Bertl
zero noticeable latency that is
05:46
troy_s
I can think of a few instances where I have seen a system with about a two frame lag, and it is effectively impossible to operate.
05:46
dmj_nova
I assumed that it would be a big deal but had some implementation thoughts
05:47
troy_s
Even the most experienced struggle with it because you can anticipate but the disconnect causes all sorts of issues.
05:47
troy_s
General rule - there simply is zero latency.
05:47
troy_s
(on all gear including remote heads)
05:47
dmj_nova
that means you need the focus puller physically tethered and viewing a direct feed over a physical cable
05:47
Bertl
we are, to some degree, technical people, so let's settle for zero noticeable :)
05:48
troy_s
But it is a de facto norm for even the lowest-budget productions to have a 1st to pull focus.
05:48
Bertl
the thing is, whatever you do, you will never get zero delay
05:48
dmj_nova
yes, we'll always be reading the previous frame
05:48
troy_s
dmj_nova: Either physical via a follow focus or ff and whip or a FIZ unit (remote Focus, Iris, Zoom)
05:48
dmj_nova
because we can't get sensor data before it has been collected and read
05:49
troy_s
(and aperture if needed for a stop yank)
05:49
troy_s
dmj_nova: The latency is about zero. Even a frame can screw you up.
05:49
Bertl
yes, and in many cases, the image acquisition will happen way faster than the output can deliver
05:50
Bertl
troy_s: it is technically impossible to get below one frame in a global shutter setup
05:50
dmj_nova
troy_s: You see what I mean about seeing the frame that was just captured rather than the frame which is being captured though, right?
05:51
troy_s
Bertl: Just saying that the systems in use are effectively perfect frame sync
05:51
dmj_nova
well, the only way to get below one frame is to use a beam splitter or two lenses
05:51
Bertl
perfect one-off frame sync systems, yes :)
05:51
troy_s
Bertl: Beyond a shadow of a doubt. Long crusades go on for remoting the live viewfinder and having zero latency.
05:52
dmj_nova
two lenses has obvious implications for framing and focus
05:52
dmj_nova
beam splitter removes half the light from the camera
05:52
troy_s
Bertl: Not sure how the Alexa or Sony's deal with it, but it is certainly not a frame out. Likely due to 180 shutter
05:53
troy_s
At 24 fps, a frame is easily noticeable.
05:53
dmj_nova
well, technically I suppose one could shoot 48fps
05:53
troy_s
And they are not out. Operators would be completely pissed.
05:53
dmj_nova
and toss half the frames
05:53
Bertl
doesn't help if your viewfinder cannot work at high framerates as well
05:53
troy_s
No. That would bugger your motion blur.
05:54
dmj_nova
but 180 shutter
05:54
troy_s
Aesthetic dictates that 24/25 is the learned baseline.
05:54
dmj_nova
so you capture 360
05:54
troy_s
180 shutter at 24.
05:54
troy_s
No cheats.
05:54
Bertl
the funny thing is, human vision cannot 'react' faster than 150ms
05:54
troy_s
1/48th
05:54
dmj_nova
180 shutter means you capture for 1/2 of each 1/24 second period
05:55
dmj_nova
each frame is a 1/48th exposure
05:55
troy_s
You realize I know this very well right?
05:55
Bertl
so everything below that value will be considered instant as long as there is no disruption in continuity
05:55
dmj_nova
during that "dark" time, you could sneak a frame in that you don't actually keep but allow the operator to see for focus
05:55
troy_s
Just saying commercial cameras capture 24 at 48th. Easily. No cheats. And not a frame out.
05:56
Bertl
which actually means that at 25FPS, we still can skip 2 frames and 'get away' with it :)
05:56
dmj_nova
(it was a question from me!)
05:56
troy_s
dmj_nova: That is sort of what I am suggesting. That is in fact precisely how older mirror systems worked.
05:56
troy_s
Bertl: You might like to think that
05:57
dmj_nova
Bertl: troy_s has a really good idea here
05:57
troy_s
Bertl: But I have been on sets with a single frame of latency and the operators go bat shit.
05:57
Bertl
a single _additional_ frame, yes I buy that
05:57
troy_s
Bertl: I can assure you that two frames is completely a nightmare to operate.
05:57
dmj_nova
180 degree shutter means that the sensor is only being used for 1/48th of a second to capture a frame every 1/24th of a second
05:57
troy_s
If your playback is a frame behind your remote wheels, it will drive you nuts.
05:58
troy_s
dmj_nova: Only issue is variable shutter - where you may be at a wider angle.
05:58
troy_s
Like a 270
05:58
dmj_nova
that gives us about 1/48th of a second to capture *another* frame in between the ones we actually keep
05:58
Bertl
for what purpose?
05:58
troy_s
So it isn't much of a solution.
05:59
troy_s
Bertl: Aesthetic or technical
05:59
dmj_nova
doubling our framerate for focus and halving our latency until display
05:59
Bertl
how is the second frame better or worse than the first one?
05:59
troy_s
Extremely common
05:59
troy_s
Huh?
05:59
Bertl
i.e. what is the point in taking a second frame?
05:59
dmj_nova
troy_s: wider angle...that's not a problem unless noise and dynamic range bother the focus puller
06:00
dmj_nova
the 180 degree shutter can actually be very important for 2 reasons
06:00
dmj_nova
1) it produces the expected motion blur common in cinema
06:00
troy_s
Wider is an issue if you hope to alternate for playback.
06:00
dmj_nova
2) it may be necessary to sync with flickering lights
06:01
troy_s
That is the technical reason. You may need a 144 etc.
06:01
dmj_nova
troy_s: well, I'm kinda assuming you can do some clever exposure compensation on the early frames
06:01
dmj_nova
*the between frames
06:01
troy_s
No clue how the viewfinder would work. But zero frame latency at 24 is important.
06:02
troy_s
Regardless of shutter.
06:02
dmj_nova
assuming linear response you can just multiply by the shutter ratio
06:02
dmj_nova
this will *not* work with 360 degree shutter
06:03
dmj_nova
there you're just stuck with what you've got
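The exposure-compensation idea for the sneaked-in monitoring frame can be sketched as follows. The numbers (24 fps, 180-degree shutter, using half the dark time) and the function name are illustrative assumptions, not part of any real implementation:

```python
# Sketch of the gain needed to brighten a short "dark time" monitoring
# frame so its linear values match the recorded frames. Assumes a linear
# sensor response, as discussed above; all figures are illustrative.

def monitoring_gain(record_exposure: float, monitor_exposure: float) -> float:
    """Gain that scales a monitoring frame exposed for monitor_exposure
    seconds up to the brightness of a record_exposure-second frame."""
    if monitor_exposure <= 0:
        raise ValueError("monitoring frame needs a non-zero exposure")
    return record_exposure / monitor_exposure

fps = 24
shutter_deg = 180
record_exposure = (shutter_deg / 360) / fps    # 1/48 s per recorded frame
dark_time = 1 / fps - record_exposure          # 1/48 s of shutter-closed time

monitor_exposure = dark_time / 2               # use half the gap: 1/96 s
gain = monitoring_gain(record_exposure, monitor_exposure)  # 2.0
```

At a 360-degree shutter, `dark_time` collapses to zero and there is no gap to sneak a frame into, which matches the objection above.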
06:03
troy_s
I would think the image might suck if at a 270 shutter and gained :)
06:03
dmj_nova
troy_s: yeah, there's a limit to what you can push it to
06:03
troy_s
That is a whole stop of light
06:03
dmj_nova
we can't be magic
06:04
troy_s
Anyways... I am sure Bertl can figure out some magic.
06:04
troy_s
the key is likely refresh hz.
06:04
dmj_nova
aside from the clever dark-time second exposure, we can't really do more magic
06:04
Bertl
I won't use my magic unless you explain why a second frame would help with the latency? :)
06:04
troy_s
As long as the refresh hz is faster than the dumping all is fine.
06:05
troy_s
I never said "second frame"
06:05
troy_s
I only suggested that a frame of latency would be a nightmare to operate from
06:05
dmj_nova
Bertl: So at 24fps capture, you see what's happening 1/24th of a second after it happens
06:05
Bertl
not necessarily
06:06
dmj_nova
if you have 180 degree shutter, your camera actually doesn't capture for half the time
06:06
Bertl
entirely depends on the exposure and transfer times
06:06
dmj_nova
oh, wait, I'm dumb
06:06
troy_s
Must sleep. Ciao friends.
06:06
dmj_nova
troy_s: with 180 shutter we actually could get that frame 1/48th of a second after capture
06:07
Bertl
with 0 transfertime, yes
06:07
dmj_nova
(well almost, electronics aren't magic)
06:08
dmj_nova
that extra frame would help with latency *between* frames, but not between exposure and display
06:08
dmj_nova
that's a whole different kind of responsiveness
06:08
dmj_nova
which may be helpful but certainly at a cost in battery life
06:08
Bertl
whatever you do, the transfer won't start before the exposure ends
06:09
dmj_nova
yes, exactly
06:09
Bertl
assuming that you have the same FPS on the viewfinder (which is unlikely)
06:09
dmj_nova
never claimed that :)
06:09
dmj_nova
viewfinder likely has a higher refresh rate if anything
06:09
Bertl
you could at least theoretically start displaying the frame immediately after exposure ends
06:09
Bertl
which gives you exactly one frame delay :)
06:10
Bertl
you cannot possibly get better than that
06:10
Bertl
(at least not for the entire frame :)
06:10
dmj_nova
Bertl: you're right, but you're using the wrong terms
06:10
dmj_nova
one *exposure* delay
06:10
dmj_nova
not one *frame* delay
06:10
Bertl
usually you will also have the delay from transfer and digitization
06:11
dmj_nova
exposure duration and frame duration usually aren't synonymous
06:11
Bertl
yes, but it doesn't help if you lower the exposure time, because you need to increase the display framerate for that, which in turn means that you have to display the same frame twice
06:12
Bertl
unless you actually increase the frame rate, at which point you're back to 1 frame delay :)
06:12
dmj_nova
Bertl: most LCD screens refresh at much more than 24 Hz
06:12
dmj_nova
60, 72, or 120 are common rates
06:13
dmj_nova
okay, let's have 3 terms:
06:13
Bertl
yup, so if your exposure time is 1/60th second
06:13
Bertl
and your display time is also 1/60th of a second
06:13
Bertl
then you can get away with 1/60th of a second delay (at best)
06:13
Bertl
usually you will have 1/30th of a second delay
06:14
Bertl
which is still higher than the 24FPS you might be recording
06:15
dmj_nova
1) exposure duration = time global shutter collects light for each frame (ex 1/48th)
06:15
dmj_nova
2) frame rate = number of frames per second that are captured and permanently stored (1/24th)
06:15
dmj_nova
3) monitor refresh rate = framerate of the display used to monitor footage (1/60th)
06:15
dmj_nova
Bertl: yes
06:16
Bertl
you can add a fourth one to that, the rate the viewfinder image is transmitted with
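The rates being defined here interact in a simple way; the toy model below reproduces Bertl's 1/60 s example, with transfer/digitization time reduced to a single placeholder parameter:

```python
# Toy model of viewfinder delay: a frame exposed for exposure_s seconds
# cannot appear before its exposure ends, and then has to wait for the
# next refresh of a display running at refresh_hz.

def viewfinder_delay(exposure_s: float, refresh_hz: float,
                     transfer_s: float = 0.0) -> tuple[float, float]:
    """(best, worst) delay in seconds from the start of the exposure
    until the frame can be shown."""
    refresh = 1 / refresh_hz
    best = exposure_s + transfer_s             # ready exactly on a refresh
    worst = exposure_s + transfer_s + refresh  # just missed one refresh
    return best, worst

# 1/60 s exposure shown on a 60 Hz display:
best, worst = viewfinder_delay(1 / 60, 60)     # ~16.7 ms best, ~33.3 ms worst
```

With these numbers the best case is 1/60 s and the usual case 1/30 s, matching the figures quoted above.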
06:16
dmj_nova
also: you can totally manipulate such a system to get you closer to 1/60th of a second
06:17
Bertl
because even if the viewfinder has 120Hz, it doesn't help if the data stream to the recording medium is used, for example
06:17
dmj_nova
Bertl: true
06:17
dmj_nova
I was simplifying and assuming we weren't deliberately being dumb :)
06:18
Bertl
has nothing to do with dumb actually
06:18
dmj_nova
so we would be matching the two
06:18
dmj_nova
didn't mean to insult :)
06:18
Bertl
I've heard from a number of folks that sometimes you want to see the data being recorded and not some idealized view
06:18
dmj_nova
but yes, "display output rate" might have been better
06:19
Bertl
don't forget that motion and motion blur works completely different at higher rates
06:19
dmj_nova
Bertl: that's only with motion tweening
06:20
dmj_nova
Not with simply refreshing the same image three times
06:21
dmj_nova
which incidentally is done already with modern screens and projectors to compensate for flicker headaches even with 24 fps
06:21
dmj_nova
so in the theatre they show you each 1/24th of a second frame twice
06:22
dmj_nova
it doesn't change the motion blur properties, but it makes it so you can't "feel" the flicker
06:22
dmj_nova
and I may have just explained to myself why 180 degree shutter is prized for realistic motion blur
06:23
dmj_nova
or that could be a coincidence
06:23
dmj_nova
Bertl: or are you talking about something else?
06:24
Bertl
well, modern screens/projectors interpolate the missing frames
06:24
Bertl
because the information is already available in the codecs
06:24
Bertl
i.e. motion estimation and similar stuff
06:25
Bertl
fact is, if your exposure time is shorter than the frame duration, you will get 'missing' gaps in motion blur
06:25
Bertl
you can compensate for those by e.g. taking two frames and combining them
06:26
dmj_nova
Bertl: yes, but that's a terrible horrible thing that looks awful and needs to stop
06:26
Bertl
which would allow you to display each frame in a view finder 'earlier' than the combined frame
06:26
Bertl
well, it won't be different from a longer exposure time :)
06:26
dmj_nova
Bertl: it will be noisier
06:27
Bertl
actually it will have less noise
06:27
dmj_nova
not necessarily by a lot, but it will be
06:27
dmj_nova
isn't there noise associated with each readout?
06:27
Bertl
okay, yes, if you consider the bad quality of sensors, then yeah
06:28
Bertl
the readout is probably the noisiest part, but it will not increase
06:28
dmj_nova
well, you have double readout noise
06:28
dmj_nova
some of that will cancel, but not all
06:28
Bertl
i.e. if the readout causes a 5% noise, the combination will have 10% noise of the original value, but twice the intensity
06:29
Bertl
so summing and dividing by 2 (aka averaging) will give 5% noise
06:29
dmj_nova
no, not twice the intensity
06:29
Bertl
that is if the noise is not a gaussian noise
06:29
Bertl
in which case, the noise will actually get less by averaging
06:29
dmj_nova
half the exposure => half the light collected
06:31
Bertl
well, how does the readout noise figure into that then?
06:31
Bertl
will it be the same or less or more?
06:32
Bertl
it is hard to quantify and generalize
06:32
dmj_nova
you have two sets of readout noise
06:32
Bertl
I'm pretty sure in certain cases the noise will be better and in other cases worse, around the typical cases it will be about the same
06:32
dmj_nova
2*noise + 1/2*light + 1/2*light
06:33
dmj_nova
= 2*noise + light
06:33
Bertl
so you want to only use half the ADC range for 1/2 the light?
06:33
Bertl
i.e. sample a 'darker' picture?
06:34
Bertl
usually you will adjust the ADC gain to give a full range image, no?
06:34
dmj_nova
whereas with one exposure you get: noise + light
06:34
dmj_nova
yes, but increasing gain means increasing noise in that capture
06:35
Bertl
and on the other hand, the termal noise gets less when reduced exposure
06:35
Bertl
*thermal
06:35
dmj_nova
so the averaging will negate the increased noise from the gain, but not from the sampling process itself
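The readout-noise argument can be checked with a toy Monte Carlo: independent readout noises add in quadrature, so two half exposures read out twice and summed carry roughly sqrt(2) times the readout noise of a single full exposure. Photon shot noise and thermal effects are ignored here, and the signal level and sigma are made-up numbers:

```python
import random
import statistics

random.seed(1)

signal = 1000.0    # linear counts over the full exposure (made-up value)
sigma_r = 5.0      # readout noise per readout, in counts (made-up value)
n = 20000          # number of simulated frames

# One full exposure, one readout:
single = [signal + random.gauss(0, sigma_r) for _ in range(n)]

# Two half exposures, two readouts, summed:
summed = [(signal / 2 + random.gauss(0, sigma_r))
          + (signal / 2 + random.gauss(0, sigma_r)) for _ in range(n)]

noise_single = statistics.stdev(single)  # ~5.0
noise_summed = statistics.stdev(summed)  # ~5.0 * sqrt(2), i.e. ~7.1
```

So in the readout-limited regime the summed pair is indeed noisier than the single exposure, though by a factor of sqrt(2) rather than the factor of 2 a naive linear "2*noise" count suggests.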
06:35
dmj_nova
Bertl: what about the thermal noise?
06:36
Bertl
it will get larger with longer exposure times
06:36
dmj_nova
but your two exposures is still exposing for the same amount of time
06:37
Bertl
but with resets inbetween
06:37
dmj_nova
very brief ones
06:37
Bertl
i.e. the photosites will drift away from the optimal discharge path
06:38
Bertl
if you reset more often, you get less noise
06:38
jucar1
left the channel
06:38
dmj_nova
Bertl: did not know that
06:38
dmj_nova
Bertl: you could test this
06:38
Bertl
we will probably test this and more
06:38
dmj_nova
how many exposures can you capture now?
06:39
Bertl
depends on the area
06:39
jucar
joined the channel
06:39
Bertl
for a small area (i.e. number of lines) we can probably do about 200 or more
06:39
dmj_nova
try your trick with as many consecutive exposures as possible and an equivalent single exposure
06:40
dmj_nova
see what the relative noise you get is
06:40
Bertl
as I said, a lot will be tested, currently the focus is on getting the alpha prototype done
06:40
dmj_nova
fair enough
06:41
dmj_nova
I've also done some work on temporal denoising
06:42
dmj_nova
have some blender compositing setups that are pretty good
06:43
dmj_nova
also: dark noise always adds, never subtracts
06:44
dmj_nova
stray electrons never lead to values darker than pure black
06:45
Bertl
interesting that you say that, do you know how a CMOS sensor works?
06:45
dmj_nova
the basics
06:46
Bertl
the photosites are precharged to a certain voltage, and light 'discharges' them over time
06:46
Bertl
now it really depends on where those stray electrons are
06:47
dmj_nova
you mean wrt the band gap?
06:47
Bertl
http://www-isl.stanford.edu/~abbas/group/papers_and_pub/hui_thesis.pdf
06:48
Bertl
here is a nice read if you're interested in
06:50
dmj_nova
oh, this is fascinating
06:55
dmj_nova
hmm...it occurs to me that I'd actually never looked up what a full CMOS pixel looked like
06:55
dmj_nova
just dealt with photodiode operation
07:09
dmj_nova
Bertl: okay, this is a lot to digest
07:10
Bertl
have fun digesting :)
07:10
Bertl
I'm off to bed now ... have a good one everyone!
07:22
Sasha_C
left the channel
08:18
Sasha_C
joined the channel
09:09
PhilippeJ
joined the channel
09:53
se6astian
joined the channel
10:20
PhilippeJ
Hello !
10:25
PhilippeJ
Bertl, troy_s, dmj_nova just read the backlog about the delay in the pipeline. There *is* some delay between reality and what is shown on the display, in any camera, although this delay is kept as small as possible. It is hard to say how much is too much; less than a frame means less than 41 milliseconds. I guess we can achieve much better, like 10 msec delay, which I'd translate as humanly not perceivable.
10:29
PhilippeJ
(not counting exposure time btw)
10:30
PhilippeJ
Interesting summary : http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5540888&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5540888
10:53
PhilippeJ
and another one, focused on video games, an area where latency is studied quite well : http://enterthesingularity.blogspot.be/2010/04/latency-human-response-time-in-current.html
10:53
PhilippeJ
experimental results: 50-60ms absolute limit of delay detection (small object tracking, mouse cursors)
10:54
PhilippeJ
which is our famous 41 msec + 10 msecs processing :-)
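The arithmetic here is a one-liner; as a trivial sanity check (threshold numbers taken from the linked posts above, processing budget assumed):

```python
# One frame of delay at 24 fps plus an assumed ~10 ms of processing lands
# right inside the cited 50-60 ms detection band.
frame_ms = 1000 / 24                  # ~41.7 ms per frame at 24 fps
processing_ms = 10                    # assumed processing budget
total_ms = frame_ms + processing_ms   # ~51.7 ms
within_detection_band = 50 <= total_ms <= 60
```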
11:41
se6astian
Can you add that to the wiki
11:41
se6astian
I fear more and more that most of the information generated in here is just lost because it's not recorded anywhere
11:44
PhilippeJ
yep
11:45
PhilippeJ
I can't edit the wiki homepage with my philippejadin account?
11:45
se6astian
hmm, interesting
11:45
se6astian
did you fill the captcha?
11:46
PhilippeJ
I didn't have any captcha to fill
11:46
PhilippeJ
there is no edit tab even when I'm logged
11:46
se6astian
ok, let me take a look
11:47
PhilippeJ
only on the homepage btw
11:47
PhilippeJ
I'm member of Autoconfirmed users, Users
11:49
se6astian
try now please
11:50
PhilippeJ
still no edit tab :-(
11:50
PhilippeJ
I have a view source instead
11:50
se6astian
can you edit the sandbox?
11:50
se6astian
https://wiki.apertus.org/index.php?title=Sandbox
11:50
se6astian
I think the main page is protected
11:50
se6astian
let me make you admin
11:51
PhilippeJ
well in fact
11:51
PhilippeJ
I can edit the alpha prototype page
11:51
PhilippeJ
but it's getting long, and the info I'm going to put is more generic
11:51
se6astian
unprotected main page
11:52
PhilippeJ
works
11:52
se6astian
great
11:52
se6astian
lets see if spam increases as well now
11:53
se6astian
the wiki is only mollom protected now
11:53
se6astian
even anonymous can edit
11:53
PhilippeJ
ah well, if you want you can allow only registered users, who knows
11:53
se6astian
I read up on it
11:53
se6astian
and they recommend it that way
11:54
se6astian
because bots are used to registering user names
11:54
se6astian
but no bot tries to edit the page as anonymous :)
12:00
PhilippeJ
posted here : https://wiki.apertus.org/index.php?title=Image_latency
12:00
PhilippeJ
I didn't summarize the whole discussion since I wasn't present, if anyone has time, feel free
12:00
se6astian
great, thanks
12:01
se6astian
there was also some discussion about ergonomics yesterday which I would love to have summarized
12:02
se6astian
when you say less than 41ms
12:02
se6astian
that refers to 24 FPS I assume?
12:02
se6astian
41.666ms ?
12:05
se6astian
ah yes, it's noted further down
12:29
intracube
joined the channel
12:42
PhilippeJ
I referred to the expected IRC log URL, although it's currently empty :-)
12:53
se6astian
should be filled soon :)
13:01
dmj_nova1
joined the channel
13:02
PhilippeJ
hello dmj_nova1, feel free to fill this with more info : https://wiki.apertus.org/index.php?title=Image_latency
13:02
PhilippeJ
is heading to home, see you soon!
13:03
dmj_nova
left the channel
13:04
PhilippeJ
left the channel
13:12
FergusL
left the channel
13:13
FergusL
joined the channel
13:21
leo_m
joined the channel
14:01
philippej
joined the channel
14:12
aombk
left the channel
14:19
Bertl
morning everyone!
14:24
se6astian
hello Bertl!
14:29
philippej
heya !
14:55
Sasha_C
left the channel
15:00
philippej
se6astian, anything we might look at ?
15:02
se6astian
look at?
15:04
philippej
work on :-)
15:05
se6astian
the website page explaining the working conditions to new members, maybe?
15:05
philippej
yep
15:06
se6astian
maybe here: https://www.apertus.org/node/120
15:06
se6astian
that page is linked to from: https://www.apertus.org/contribute
15:07
se6astian
but we might need to make it more prominent
15:07
philippej
yep
15:08
philippej
we should add your now-standard phrase of "join us on IRC, introduce yourself and ask for work :-)"
15:08
philippej
a little google doc ?
15:08
se6astian
good idea
15:08
se6astian
yes
15:10
philippej
let me create it
15:11
philippej
https://docs.google.com/document/d/13H9ghagelbw41WHH4J6dIyP862TC3VriXyNsE0QH3gE/edit?usp=sharing
15:14
se6astian
I cant edit yet
15:15
philippej
no you can :-)
15:15
philippej
(now)
15:18
Bertl
philippej: you want input?
15:18
philippej
Bertl, always welcome, either here or directly on the doc
15:18
Bertl
no write access either
15:19
philippej
it should be fixed, maybe reload the page
15:23
[1]leo_m
joined the channel
15:25
leo_m
left the channel
15:25
[1]leo_m
changed nick to: leo_m
15:36
Sasha_C
joined the channel
15:36
Sasha_C
left the channel
15:51
troy_s
Hello all.
15:52
Bertl
hello troy_s! how's going?
15:55
troy_s
dmj_nova1: “realistic motion blur” = misnomer. All aesthetics are emergent phenomena. Even a basic photograph is a convention. Net sum is that 24fps merely is the learned aesthetic baseline.
15:57
troy_s
philippej: I can't offer any insight other than the fact that at 24fps, the display in viewfinder, onboard, or remote must be frame accurate. A single frame behind will cause fits and bring much derision. :)
15:58
philippej
troy_s, I was thinking that to measure the delay between reality and what's on the viewfinder, we could film existing cameras (at a somewhat high framerate) with both a clap and the viewfinder in frame, to guesstimate the delay they have.
15:58
troy_s
Bertl: Good thanks. You sir?
15:59
troy_s
philippej: I am sure that documentation must be online somewhere.
15:59
Bertl
everything fine here as well ...
15:59
troy_s
philippej: Certainly for the Alexa or F65 etc.
16:00
philippej
troy_s, let me dig this
16:00
troy_s
philippej: I suspect it is in your estimated ballpark. 50ms or so?
16:01
philippej
for the alexa, they say, less than one frame, but it's for the viewfinder
16:01
se6astian
ok, google doc looks fine
16:01
troy_s
philippej: I do know for certain that gobs of money is spent in remotely transmitting HD signals with extremely low latency for remote head applications and operation.
16:01
se6astian
I will ask sasha to look over it and then edit the current page content to replace it with the new
16:02
philippej
troy_s, definitely, maybe I misread your conversation, I thought you were speaking in the latency between reality and the viewfinder
16:02
troy_s
philippej: EG http://www.teradek.com/pages/bolt
16:02
troy_s
philippej: That too
16:02
philippej
I think we met those guys at ibc
16:03
troy_s
philippej: I only illustrated the latency concern with remote views. Viewfinder is equally important, and is likely as-good-as-it-gets.
16:04
philippej
as I see it, in axiom, all the processing from sensor to sdi (for example) will be done in the fpga, so it will be as good as it gets
16:05
philippej
I remember when we visited intopix stand at ibc (they do compression), Guerric told me that their compression scheme added "only a few lines" of latency
16:05
troy_s
philippej: I only voiced concern when I was reading "only a frame of latency" which in camera work, is unacceptable.
16:06
philippej
each processing step adds some delay, depending on the complexity of the algorithm involved, or the required data (for example if you need 10 lines for a debayer algorithm, obviously you introduce at least those 10 lines of delay)
16:06
philippej
I guess Bertl has some idea of the expected latency between reality and SDI out
16:08
troy_s
Onboard could likely be sensel skip for debayer.
16:08
troy_s
The SDI out "raw" (debayered) would likely need to be higher quality, and "zero latency" goals are less important.
16:09
troy_s
(roll it through a cheap GPU shader onboard even?)
16:09
Bertl
well, in the optimal case we have the exposure time + FOT + readout time
16:09
troy_s
FOT?
16:10
troy_s
I wonder if gcolburn made way with the profiling.
16:10
Bertl
frame overhead time
16:10
Bertl
i.e. the time it takes to copy and digitize the samples
16:15
troy_s
Gotcha
16:17
Bertl
which will be at least 1, probably more like 2 'exposure times' as we decided to call it
16:17
Bertl
(for short exposures it might be quite some more)
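Bertl's latency chain (exposure time + FOT + readout time) can be put into a small budget calculation; every number below is a placeholder assumption, not a measured value for any real sensor:

```python
# Minimal latency budget along the lines sketched above: exposure time,
# plus frame overhead time (FOT: copying and digitizing the samples),
# plus readout. All numbers are placeholder assumptions.

def sensor_latency_ms(exposure_ms: float, fot_ms: float,
                      readout_ms: float) -> float:
    """Minimum time from the start of an exposure until the last sample
    of that frame has been digitized and could be handed to a display."""
    return exposure_ms + fot_ms + readout_ms

frame_budget_ms = 1000 / 24                            # ~41.7 ms at 24 fps
latency_ms = sensor_latency_ms(exposure_ms=1000 / 48,  # 180-degree shutter
                               fot_ms=1.0,             # assumed FOT
                               readout_ms=15.0)        # assumed readout
fits_in_one_frame = latency_ms < frame_budget_ms
```

With these made-up figures the chain comes in under one frame period; a longer readout or a longer exposure eats the margin quickly, which is Bertl's point about short exposures not helping on their own.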
16:24
se6astian
time to leave the office
16:24
se6astian
see you later
16:24
se6astian
left the channel
16:48
philippej
left the channel
16:49
philippej
joined the channel
17:40
se6astian
joined the channel
18:12
[1]leo_m
joined the channel
18:12
leo_m
left the channel
18:12
[1]leo_m
changed nick to: leo_m
18:22
philippej
left the channel
18:22
se6astian
more progress with irc channel logs: https://www.apertus.org/irc/index.php
19:37
intracube
changed nick to: intracube_afk
20:30
intracube_afk
left the channel
22:19
se6astian
bedtime :)
22:19
se6astian
nighty
22:20
se6astian
left the channel
22:20
jucar
left the channel
22:26
jucar
joined the channel
22:36
rexbron
left the channel
22:36
rexbron
joined the channel
22:36
rexbron
left the channel
22:36
rexbron
joined the channel
22:38
intracube
joined the channel
23:28
jucar
left the channel