
#apertus IRC Channel Logs

2018/04/30

Timezone: UTC


02:31
rton
left the channel
02:57
supragya
joined the channel
03:02
Bertl_oO
off to bed now ... night everyone!
03:02
Bertl_oO
changed nick to: Bertl_zZ
04:25
futarisIRCcloud
joined the channel
04:53
illwieckz
left the channel
05:00
illwieckz
joined the channel
05:12
supragya
left the channel
05:58
supragya
joined the channel
07:03
supragya
left the channel
08:32
se6astian|away
changed nick to: se6astian
08:57
niemand
joined the channel
08:57
niemand
left the channel
08:57
niemand
joined the channel
09:01
Bertl_zZ
changed nick to: Bertl
09:01
Bertl
morning folks!
09:21
g3gg0
joined the channel
09:21
g3gg0
hi
09:24
se6astian
good day
09:35
TycoKiaz
joined the channel
09:40
TycoKiaz
left the channel
09:41
rton
joined the channel
10:00
ArunM
joined the channel
10:05
futarisIRCcloud
left the channel
10:21
RexOrCine|away
changed nick to: RexOrCine
10:38
niemand
left the channel
10:50
niemand
joined the channel
11:13
intrac
left the channel
11:24
supragya
joined the channel
11:24
supragya
hi @ g3gg0
11:28
supragya
*sorry, wasn't available 2 hours ago, read the mail just now.
11:30
supragya
se6astian: Is dual ISO something that Apertus plans to implement in AXIOM in the foreseeable future? For HDR?
11:36
danieeel
that depends on whether the sensor has dual-iso, and the CMV12000 has not
11:36
danieeel
but it has other ways to achieve HDR
11:36
supragya
such as ..
11:37
danieeel
piecewise linear response curve
11:37
danieeel
changed nick to: danieel
11:40
Bertl
the CMV12k can do different exposures on even and odd rows
11:41
danieel
has that some name?
11:41
danieel
dual-expo? :)
11:41
supragya
I was wondering the same :)
11:41
supragya
http://acoutts.com/a1ex/dual_iso.pdf
11:42
danieel
dual iso is when you modify the conversion gain, not the exposure, otherwise you get some nasty artifacts
11:49
Bertl
IMHO the piecewise linear HDR is more powerful anyway ...
11:49
Bertl
off for now ... bbl
11:49
Bertl
changed nick to: Bertl_oO
11:52
Bertl_oO
and I guess the name would be dual-exposure, no? :)
11:55
supragya
I was wondering if there will be cases when two cameras' output needs to be interlaced onto a single file... one thought that came to mind was a 3D rig... Any other reasons why one may interlace these into 2 streams? Is it meaningful to have 2 streams of video in a single file in a 3D rig?
11:55
supragya
interlaced as in... into separate streams
11:57
danieel
i would assume it makes sense to have it in 1 file only as a delivery format, not so much as the raw capture
11:57
danieel
usual post works in a way you throw into it a lot of input footage and it produces something meaningful
11:58
danieel
having to select substreams is extra work :)
11:58
supragya
can you explain this? would you want this to have in-camera sync if it were two streams in a single file? I think this would make sense
11:58
danieel
i do multicam VR / LF and sources are better kept separate
11:59
danieel
camera sync is about timing of sensors, completely unrelated to muxing two video tracks into a single file
12:00
g3gg0
hi. back from dinner
12:00
supragya
sync as in ordering and timing of two streams with respect to each other
12:01
supragya
hi g3gg0 !
12:04
g3gg0
dual-iso: exactly. the big plus is that the interlaced lines have the same exposure time, preventing some tearing artefacts
12:05
g3gg0
still it was a technical solution for a problem that maybe apertus cams initially dont have
12:05
g3gg0
e.g. if you can use non-linear sensors
12:06
supragya
dual expo has varied exposure time for odd and even lines?
12:06
supragya
that would create tearing... more of a ghosting artifact, no?
12:07
niemand
left the channel
12:07
g3gg0
well maybe thats the better technical term, yes
12:07
g3gg0
varying exposure time is good for still photography
12:08
supragya
but not for vids... I guess HDR is not for that anyway! does any of these - PLR - require additional support in the file format?
12:08
g3gg0
i am not aware of all technical solutions out there. but dual iso was a very fitting one
12:09
g3gg0
PLR?
12:09
supragya
Piecewise linear response (https://wiki.apertus.org/index.php/PLR)
12:09
g3gg0
syncing cameras. yeah, 3D. a friend of mine built those systems for large productions
12:09
g3gg0
you have to be very exact when aligning and syncing those 3D cameras
12:10
g3gg0
even the slightest error can cause trouble for the viewers in cinemas even if you cannot really measure any difference
12:11
g3gg0
ah, PLR ok. dont know if we have to transport additional data from the camera sensor to the developing application - do we?
12:11
supragya
If odds are on different exposure level than even lines, would it make sense to capture the two settings into an additional header?
12:12
g3gg0
*if* thats done, you can define your own blocks for that
12:12
supragya
a newer version of MLV you say?
12:13
g3gg0
hehe maybe its not clear. MLV's structure does not restrict the number and types of blocks that you use
12:13
supragya
I know that from day 1 :)
12:13
supragya
skip what you don't understand - for readers
12:13
g3gg0
so adding a block doesnt make a new MLV version
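The skip-unknown-blocks property g3gg0 describes can be sketched as follows. The byte layout here is a simplified stand-in (4-byte type plus uint32 total size), not the full MLV structs, and the real MLVI file header has additional fields this sketch ignores.

```python
import struct

def walk_blocks(buf):
    """Yield (blockType, offset, blockSize) for each block in an
    MLV-style byte buffer. Every block starts with a 4-byte type and a
    uint32 total size, which is what lets readers jump over block
    types they do not understand."""
    pos = 0
    while pos + 8 <= len(buf):
        btype = buf[pos:pos + 4].decode('ascii')
        size, = struct.unpack_from('<I', buf, pos + 4)
        if size < 8:  # corrupt size field, stop rather than loop forever
            break
        yield btype, pos, size
        pos += size

def known_blocks(buf, known=('MLVI', 'VIDF', 'AUDF', 'EXPO', 'LENS')):
    # Unknown block types are skipped by jumping blockSize bytes ahead.
    return [t for t, _, _ in walk_blocks(buf) if t in known]
```

This is why adding a new block type does not bump the format version: an old reader simply skips what it does not recognize.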
12:14
supragya
hmm... okay
12:14
supragya
what makes a new version then?
12:14
g3gg0
incompatible changes to the file format
12:15
g3gg0
when e.g. a VIDF suddenly has a 64 bit field for uhhh frame number ;)
12:15
supragya
g3gg0: can I ask about the thinking that went behind v2.1
12:15
supragya
how it is to be used?
12:16
g3gg0
you mean the field in header MLVI ?
12:16
supragya
mlv_subc_hdr_t
12:16
g3gg0
ah okay sure
12:17
supragya
nice kid photo on G+ :), your son?
12:17
g3gg0
the question was (a year or so ago) how MLV could handle multiple camera video streams
12:17
g3gg0
oh eh yeah
12:17
g3gg0
the one with my first digital camera? 4 years ago or so
12:17
supragya
:)
12:18
supragya
multiple camera video streams <- In which cases would you need it? Since for example danieel suggests that he may want two streams separate in files
12:18
supragya
in two files I mean
12:18
danieel
PLR in file format: add a LinearizationTable (reverse LUT) as in DNG
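A minimal sketch of the reverse-LUT idea danieel mentions: a stored piecewise-linear sample value indexes a table that restores the linear value, as with the DNG LinearizationTable tag. The knee position and slopes below are made-up toy numbers, not a real sensor curve.

```python
def linearize(samples, table):
    # DNG-style LinearizationTable: the stored sample value is the
    # index, the table entry is the linearized value.
    return [table[s] for s in samples]

# Toy 2-segment piecewise-linear curve: slope 1 below the knee at 4,
# slope 4 above it, so stored values 0..7 expand to 0..16.
knee = 4
table = [v if v < knee else knee + (v - knee) * 4 for v in range(8)]
```

So PLR needs no structural change to the container, only a table shipped as metadata.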
12:20
g3gg0
okay one question - is the stream to be made by *one* processor who merges the streams and writes the files?
12:20
supragya
I think, no
12:20
g3gg0
or is the rig a genlock'd one with two cameras?
12:20
supragya
the second one
12:21
g3gg0
then its as easy as this: genlock the two cameras, hardsync the record start and write two separate files (chunked is also okay)
12:22
supragya
genlock?
12:22
g3gg0
locking the two system clocks
12:22
supragya
got it
12:22
g3gg0
so they clock at the same rate -> same FPS, same timebase
12:22
g3gg0
in MLV every block has its timestamp relative to "recording start"
12:23
supragya
if that is sorted, why would you need streams? couldn't figure any other case
12:23
g3gg0
it is stored in microseconds
12:23
supragya
since only one sensor per camera can give one stream
12:23
g3gg0
can you rephrase the question, i dont understand it
12:23
g3gg0
what is sorted and which streams do you mean
12:24
supragya
why would you need multiple streams in a video file?
12:24
g3gg0
ah okay for use cases where one camera has more sensors
12:24
g3gg0
there was some time ago the theoretical use case of a single camera with two sensors iirc
12:25
supragya
o.O
12:25
g3gg0
maybe for 3D or any other use case
12:25
g3gg0
and i was asked if MLV could handle that. i said that it would be not hard to add "subchannels"
12:25
g3gg0
also for audio
12:26
supragya
It makes sense to keep the RAW output in separate files that are genlocked, as you said... was having a single CPU a consideration for multi-stream vs multi-file?
12:26
g3gg0
no idea if anyone has this goal, but would be simple to add
12:27
g3gg0
well, they can get merged later to have one file per scene
12:27
supragya
so in conclusion, am I wrong in saying that there is a limited use case of adding multiple video streams in a single file
12:29
g3gg0
i dont know of any highly critical workflow that requires one file with separate streams.
12:29
g3gg0
that doesnt speak for everyone involved :)
12:29
danieel
the multistream file is useful as an editing format (multiple video layers)
12:30
g3gg0
for one scene or for 3D?
12:30
supragya
could you elaborate this use case?
12:30
g3gg0
or both? :D
12:30
danieel
directly camera related could be to have an OSD in separate video track
12:30
danieel
some cameras can output either clean picture or burned in osd on the video out
12:30
g3gg0
right, i remember this example too. thanks
12:31
danieel
storing the osd layer separately would make sense
12:31
danieel
(use case: educational videos)
12:31
supragya
OSD?
12:31
supragya
on screen display?
12:31
danieel
on screen display - status information, grids, etc
12:32
g3gg0
but iirc that one had a much lower requirement for data rates etc
12:32
danieel
yep, you can do RLE on that data
12:32
supragya
OSD includes an overlay that we see on camera?
12:32
danieel
yes
12:32
g3gg0
so you could simply define a block OSDI (OSD Image) and store some compressed frames here with reduced rate
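The OSD-track idea could be sketched like this; the run-length coding matches danieel's point that OSD layers (mostly flat, transparent pixels) compress well with RLE. The block name OSDI and the (count, value) run layout are purely illustrative, not part of the MLV spec.

```python
def rle_encode(row):
    """Run-length encode one row of an OSD bitmap for a hypothetical
    OSDI block. Assumes a non-empty row; runs are capped at 255 so a
    count fits in one byte."""
    out = []
    run_val, run_len = row[0], 1
    for px in row[1:]:
        if px == run_val and run_len < 255:
            run_len += 1
        else:
            out.append((run_len, run_val))
            run_val, run_len = px, 1
    out.append((run_len, run_val))
    return out

def rle_decode(runs):
    # Expand (count, value) runs back into the pixel row.
    return [v for n, v in runs for _ in range(n)]
```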
12:33
g3gg0
yeah
12:34
danieel
does MLV define per-frame duration or just an occurrence timestamp?
12:34
g3gg0
timestamp
12:34
g3gg0
uint64_t timestamp - hardware counter timestamp for this frame (relative to recording start)
12:34
supragya
what's this?
12:34
g3gg0
call it the presentation time
12:34
supragya
okay... the timestamp of frame is it?
12:34
g3gg0
yes
12:35
g3gg0
every block has its timestamp. thats the time when this block was captured/created
12:35
supragya
a custom WB has what options to configure? color temp... tint... and?
12:35
g3gg0
if you record a mlv with 1 fps, every VIDF will have the timestamp increased by 1000000
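The microsecond timestamps g3gg0 describes can be sketched as:

```python
def vidf_timestamps(fps, frame_count):
    # Every MLV block carries a microsecond timestamp relative to
    # recording start; at 1 fps consecutive VIDF blocks are therefore
    # 1_000_000 us apart.
    return [round(i * 1_000_000 / fps) for i in range(frame_count)]
```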
12:35
danieel
how do you define capture time with rolling shutter ? :)
12:35
danieel
(the phase)
12:36
g3gg0
iirc it was start of exposure of this frame
12:36
danieel
of first line I assume
12:36
g3gg0
so when the first line was exposed
12:36
supragya
I guessed that too :)
12:36
supragya
but never knew was correct
12:36
g3gg0
(not sure if our implementation in camera uhmm perfectly reflects this ;) )
12:37
danieel
that is often a variable location if you change exposure, as the reference point is usually the conversion/readout of the line, which happens periodically. Exposure start varies into "negative time"
12:37
g3gg0
as long all frames have the same reference, its okay
12:37
g3gg0
so you mean that exposure might change in the middle of the frame?
12:37
g3gg0
i prefer vsync ;)
12:37
danieel
modern sensors wont allow you to do that, some produce garbage
12:38
g3gg0
(makes not much sense anyway)
12:38
supragya
really getting lost out here... why would exposure change mid flight
12:38
g3gg0
within one frame would not make too much sense if you rely on existing toolchains
12:39
g3gg0
i'd take it as given that exposure time etc changes between frames
12:39
supragya
in that case a header induces this change for upcoming frames in MLV, doesn't it?
12:40
g3gg0
EXPO / LENS block, yes
12:40
supragya
mid flight exposure changes would be more or less like rolling shutter + flash artifact
12:40
g3gg0
the VIDF after(*) the EXPO/LENS block which changed settings, will get affected
12:41
danieel
after is > or >= ?
12:41
g3gg0
(*) after: always related to the time -> timestamp
12:41
supragya
so g3gg0, regarding multi streams
12:41
g3gg0
in theory and physics these two events can not happen at the same time ;)
12:41
supragya
how are they implemented in MLV?
12:42
supragya
like expo/lens block that changes settings for upcoming vid frames?
12:42
g3gg0
if you produce a EXPO with ts=100 and a VIDF with ts=100 then you might be in trouble for the first line even when explaining the scenario
12:42
g3gg0
does the first line that is being exposed use the old settings or the new ones?
12:43
supragya
if you do it like this... you swap between streams quite often... so multiple headers for no use..
12:43
supragya
can you explain this.... could not wrap my head around this
12:44
g3gg0
conclusion: we would do good to make sure the EXPO/LENS block is queued before the exposure starts, ergo i'd write them with a timestamp lower than the next frame
12:44
g3gg0
(but yeah, specification leaves this open)
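A sketch of the convention just discussed: each VIDF is governed by the most recent EXPO/LENS block, ordered by timestamp, with metadata written at a timestamp below the frame it affects. The tuples below are simplified stand-ins for the real MLV structs.

```python
def expo_for_frames(blocks):
    """blocks: (timestamp_us, blockType, payload) tuples in any order.
    Returns (frame_timestamp, active_expo) per VIDF. At equal
    timestamps the EXPO wins here ('EXPO' < 'VIDF' in the sort key),
    which is exactly the ambiguity the spec leaves open - hence the
    advice to write EXPO with a strictly lower timestamp."""
    current = None
    result = []
    for ts, btype, payload in sorted(blocks, key=lambda b: (b[0], b[1])):
        if btype == 'EXPO':
            current = payload
        elif btype == 'VIDF':
            result.append((ts, current))
    return result
```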
12:45
g3gg0
@supragya: we have 20 bytes overhead then. for several megabyte blocks.
12:46
supragya
yes, but I was wondering if we could do better...
12:46
supragya
like constant offset checking...
12:46
g3gg0
offset checking?
12:47
supragya
(after every 100 MB maybe the stream info has to be rechecked) if it is still stream 1 or not
12:47
supragya
in case you swap constantly, you write after a given offset
12:47
g3gg0
nah that reminds me of AVI and such
12:47
supragya
but I guess this solution is very inefficient
12:48
supragya
g3gg0: exactly my thought... AVI
12:48
danieel
i see the drawback of needing to scan the whole file (with the modern caches you basically need to read the whole file to open it..)
12:48
supragya
inefficient as in.... buffer to disk flush
12:48
supragya
danieel: my worry too
12:49
g3gg0
@danieel: yes that is the downside. when you process the file, you have to build an index. in camera with its 200MHz cpu (?) it takes maybe 5 seconds for a gigabyte of video
12:49
ArunM
left the channel
12:49
supragya
MLV suggests we create an index on first read and then it would all be easier
12:49
BAndiT1983|away
changed nick to: BAndiT1983
12:49
supragya
hello BAndiT1983
12:50
g3gg0
but that index is being created once and then can be stored in an .idx file if you want to keep it
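The one-pass index build could look roughly like this, assuming the simplified 16-byte common header (4-byte type, uint32 blockSize, uint64 timestamp); a reader would do this scan once and could then persist the result as an .idx file, as g3gg0 says.

```python
import struct

def build_index(buf):
    """Single linear scan over an MLV-style buffer. Each index entry
    records (blockType, file offset, timestamp) so later reads can
    seek directly instead of rescanning the file."""
    index = []
    pos = 0
    while pos + 16 <= len(buf):
        btype = buf[pos:pos + 4].decode('ascii')
        size, ts = struct.unpack_from('<IQ', buf, pos + 4)
        if size < 16:  # corrupt size field, abort the scan
            break
        index.append((btype, pos, ts))
        pos += size
    return index
```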
12:50
BAndiT1983
hi supragya
12:50
g3gg0
hi
12:50
danieel
a sort of header-only dump with the data left out would be better?
12:50
danieel
ah ok
12:50
danieel
so you have that
12:50
BAndiT1983
hi g3gg0, reading the logs all along, but decided to join while the other PC is importing a large database dump
12:51
g3gg0
hehe, welcome
12:51
supragya
seems like I'm lost at sea... could I ask again... what all is changed when white balance changes?
12:51
g3gg0
yeah the index is some "comfort" feature like a cache
12:51
supragya
exposure? color temp? tint?
12:51
g3gg0
WB -> no change to raw data
12:52
BAndiT1983
hope you are focussing on general things first and not too much on details
12:52
g3gg0
its just the user's configuration what he wants to be considered in post
12:53
g3gg0
@BAndiT1983: yeah getting very deep occasionally. but i consider it to be necessary for the "big picture"
12:53
g3gg0
to understand what is important right now and what not
12:53
supragya
BAndiT1983: I know... but only then comparison would be meaningful
12:53
danieel
actually a wb change could optimize the gains of individual channels - is there a separate GAIN meta info for this in mlv?
12:53
g3gg0
mlv_wbal_hdr_t
12:53
supragya
danieel: https://docs.google.com/spreadsheets/d/1ItNuLj34JlK6bZgiAJLrmG7p3iyCmCju4XYyiwjqJOM/edit#gid=0
12:54
g3gg0
uint32_t wbgain_r   2048   only when wb_mode is WB_CUSTOM
uint32_t wbgain_g   1024   1024 = 1.0
uint32_t wbgain_b   2048   note: it's 1/canon_gain (uses dcraw convention)
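A quick sketch of interpreting those fixed-point fields (1024 == 1.0). Note the header's own comment says the stored values follow the dcraw 1/gain convention, so a consumer may need to invert them; this sketch only does the fixed-point conversion.

```python
def wb_multipliers(wbgain_r, wbgain_g, wbgain_b):
    # mlv_wbal_hdr_t stores white balance gains as fixed point with
    # 1024 == 1.0; divide to get floating-point multipliers.
    return (wbgain_r / 1024, wbgain_g / 1024, wbgain_b / 1024)
```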
12:54
g3gg0
hehe better.
12:54
g3gg0
line 144
12:54
danieel
no g2 gain? :)
12:54
g3gg0
in our canon-case this is only additional info
12:55
supragya
no g2 gain it seems
12:55
g3gg0
nope. want to set the second green gain separately?
12:55
supragya
others 2048 -> g2 1024
12:55
supragya
*g
12:55
BAndiT1983
do you mean g1 and g2?
12:55
g3gg0
canon did this on some models but not sure if that is the holy grail
12:55
BAndiT1983
like in rg1g2b?
12:56
supragya
i think that is what he meant
12:56
danieel
yes that
12:56
BAndiT1983
what is the purpose of setting a different gain for g2?
12:56
danieel
its raw, so gains relate often to CFA (and sensors do that)
12:56
danieel
it might be a non RGGB sensor, like RGBA
12:57
danieel
*RGBIr
12:57
g3gg0
for these special cases -> new block type
12:57
supragya
new block type <- a solution to everything?
12:57
supragya
where is the MLV standard then?
12:58
g3gg0
if the given things dont fit at all, we have to extend it
12:58
danieel
MLV is a container, so new block type is acceptable for me :)
12:58
g3gg0
i'd try to keep all at most backward-compatible as it is possible
12:59
supragya
if everything can be extended and accomodated for in MLV, there is no need for comparison with other formats :)
12:59
BAndiT1983
mpeg allows the same, tiff also etc.
12:59
danieel
the video data "codec"... is it like packaged uncompressed, or are any alternatives used? (dct/vle...)
12:59
supragya
no
13:00
g3gg0
there are two variants a) raw canon bayer bitstream b) JPEG92
13:00
supragya
but a new block if we need this XD
13:00
g3gg0
no sarcasm please ;)
13:00
danieel
canon raw is what exactly then?
13:00
g3gg0
(kidding)
13:00
BAndiT1983
offtopic a bit: one example of multiple video streams in a stream came to my mind from my studies, we had to decode a satellite mpeg stream and there were multiple video streams plus additional packages, like date, UTC etc.
13:00
g3gg0
it is.... ugly
13:01
g3gg0
the 14 bit stream packed together into 16 bits with endianness reversion
13:01
g3gg0
probably an easy thing in FPGA
13:01
g3gg0
(definitely easy)
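A simplified sketch of unpacking tightly packed 14-bit samples, MSB-first. The real Canon/MLV layout additionally reverses the 16-bit word endianness, as g3gg0 notes; that detail is deliberately left out here.

```python
def unpack_14bit(data, count):
    """Pull `count` 14-bit samples out of a tightly packed byte
    stream, MSB-first. Accumulate bits into `acc` and peel off 14 at
    a time. (No 16-bit word endianness handling - see the raw.h
    header in the magic-lantern sources for the exact layout.)"""
    acc = 0
    nbits = 0
    out = []
    for byte in data:
        acc = (acc << 8) | byte
        nbits += 8
        while nbits >= 14 and len(out) < count:
            nbits -= 14
            out.append((acc >> nbits) & 0x3FFF)
    return out
```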
13:01
danieel
BAndiT1983: that is generally the case with DVB as well - you get a stream of the multiplex you are tuned in to. Guess where the multiplex name came from :P
13:02
supragya
BAndiT1983: satellite images have as many as 21 streams at once... I had some research consideration on that part in college
13:02
danieel
so it can support up to 16 bit anything (properly aligned to lsb/msb)
13:03
supragya
EDMAC alignment?
13:03
g3gg0
yes, mlv format can handle from (uuuuh 8?) to 16 bpp
13:03
BAndiT1983
wait till 4k will be streamed over satellite, how it will reduce the bandwidth ;)
13:04
g3gg0
edmac isnt the culprit
13:04
supragya
i mean what is edmac alignment?
13:04
g3gg0
you mean frameSpace ?
13:04
supragya
yes
13:05
BAndiT1983
https://www.magiclantern.fm/forum/index.php?topic=17069.25
13:05
g3gg0
keep it zero and forget about that
13:05
g3gg0
set it to a value and this instructs the reader to ignore n bytes in the payload
13:05
supragya
n bytes hence?
13:06
supragya
n bytes from point of telling the value?
13:06
g3gg0
its just for some special cases to allow the writer to optimize the memory locations where DMA would write to
13:06
g3gg0
you could write in a way so that the raw payload starts at a sector start
13:07
g3gg0
(SD/CF card sector start)
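The sector-alignment use of frameSpace could be computed like this; the formula is illustrative of the idea (pad so the raw payload starts on an SD/CF sector boundary), not mandated by the spec.

```python
def frame_space_for(payload_offset, sector=512):
    """Padding bytes (frameSpace) a writer would insert so the raw
    payload begins on a sector boundary. payload_offset is the file
    offset where the payload would otherwise start; readers simply
    skip this many bytes of the payload."""
    return (-payload_offset) % sector
```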
13:07
supragya
what are the reasons for the addition of mlv_colp_hdr_t
13:07
supragya
?
13:07
g3gg0
or in our case primarily the DMA destination address limitations.
13:08
g3gg0
nah just drop that
13:08
supragya
okay
13:09
g3gg0
wondered if we could store the bayer pixel information in some "generic" way
13:09
g3gg0
along with color matrices and such
13:09
BAndiT1983
bson?
13:09
g3gg0
bson? ?
13:09
supragya
as in if it is RGGB or GRGB?
13:09
BAndiT1983
binary json
13:10
g3gg0
json -> /me: screaming and running away
13:10
g3gg0
;)
13:10
BAndiT1983
:D
13:10
g3gg0
@supragya: yep and the bit coding and how to get RGB from them
13:10
BAndiT1983
what is the generic way for you?
13:10
g3gg0
it is empty. none found. :)
13:10
supragya
i thought bson -> bison (grammar parser)... reminds me of last GSoC prep
13:11
g3gg0
hehe
13:11
BAndiT1983
give me a bit more info on this topic, as cfa pattern is in MLV, but what do we need besides it?
13:11
supragya
[bit coding and how to get RGB from them] <- example?
13:11
BAndiT1983
flex and bison, nostalgia here, as there was a nice tutorial series back then, about own scripting language development
13:11
supragya
wrote my own
13:12
supragya
hlang.supragyaraj.com
13:12
BAndiT1983
now we have 5 millions and 1 scripting languages out there ;)
13:12
BAndiT1983
you have used ubuntu naming scheme for it? ;)
13:12
g3gg0
https://xkcd.com/927/
13:13
supragya
XD
13:13
BAndiT1983
Bertl_oO also uses that XKCD a lot here :D
13:13
g3gg0
yeah, now talk about MLV
13:13
g3gg0
;)
13:13
supragya
g3gg0: thank you :)
13:13
supragya
yes... [bit coding and how to get RGB from them] about this
13:14
BAndiT1983
so, bit coding?
13:14
g3gg0
i have an excuse: MLV is designed to be simple and written and read with few resources
13:14
g3gg0
ok back to topic
13:14
BAndiT1983
is it about something like current raw12?
13:14
g3gg0
https://bitbucket.org/hudson/magic-lantern/src/0e6493e8ac5e118976b94237b5048c436f379d98/src/raw.h?at=unified&fileviewer=file-view-default
13:14
BAndiT1983
MLV is not the problem, it solves the overencumbering with many formats
13:15
g3gg0
see the header
13:15
g3gg0
it explains how bits are stored
13:15
BAndiT1983
know that header
13:15
BAndiT1983
14bit stream, which gave me a bit of headache to decode back then
13:16
g3gg0
yeah
13:16
supragya
is that a direction I should look at for my answer?
13:16
g3gg0
thats how "non-compressed" data is stored
13:17
danieel
it would be enough to say it is TIFF compatible (or that it is not)... it defines tight packing well
13:19
BAndiT1983
before packing considerations we should consult FPGA developer
13:20
supragya
I guess I more or less understand what MLV looks like... time to know others (MPX maybe)
13:20
g3gg0
cool
13:20
g3gg0
thats the important part of your work.
13:21
danieel
i know DNG / MP4 if you have questions
13:21
supragya
those into consideration too... I wonder which is better to look at first
13:21
g3gg0
@supragya: you know https://www.magiclantern.fm/forum/index.php?topic=13152.0
13:21
g3gg0
MLVFS
13:22
supragya
yes made a frameserver for AVI
13:22
supragya
on fuse
13:22
g3gg0
wait. you made it already?
13:23
supragya
on verge... need about 3-4 days rigorous work
13:23
supragya
wait a moment
13:23
BAndiT1983
g3gg0, it was a simple one, so we can test if OC can provide data
13:23
BAndiT1983
through AVI
13:24
supragya
link: https://nofile.io/f/zejFC7uhNPg/GSoC+2018+-+libfuse-FrameServer+_+apertus.org.pdf
13:24
supragya
(my rejected proposal :( )
13:24
supragya
the current work explained in section 1
13:25
g3gg0
ah okay. it is in concept state, right?
13:25
supragya
kind of, yes... did a bit of testing of some modules..
13:26
g3gg0
ok
13:26
supragya
why do you ask
13:26
ArunM
joined the channel
13:26
g3gg0
was confused because i sent you the MLVFS link and you replied you made an avi frameserver. i thought you put it into MLVFS :D
13:27
supragya
:P
13:32
g3gg0
btw i liked your proposal. but it made more sense to me to define the file format first
13:33
supragya
thanks g3gg0, I think I would be going for now... some household errands... will get in touch if find something else
13:33
ArunM
left the channel
13:34
g3gg0
sure. same here. lawn mowing in the sun. know better things to do
13:34
g3gg0
things that don't involve sun
13:34
BAndiT1983
like home office?
13:35
BAndiT1983
you can be glad you have sun, currently dark clouds are approaching frankfurt
13:35
supragya
seems like BAndiT1983 is in trouble! storm approaching XD
13:36
g3gg0
home office. oh well, i said i wont work today. but still i checked and replied mails.
13:36
BAndiT1983
had to opt for home office, but it's a bit tedious as i'm waiting mostly for the PC to compile or to do other stuff, not much coding because of that
13:37
g3gg0
for which target?
13:37
BAndiT1983
when it's over then i will sit down on the balcony and do some stuff for opencine, maybe also some work on MLV loader, so i can playback stuff finally
13:37
supragya
some age old technology he told me
13:37
BAndiT1983
it's GWT (google web toolkit) and java server
13:38
BAndiT1983
nothing exciting
13:38
g3gg0
oh i expected 6502 now
13:38
ArunM
joined the channel
13:38
ArunM
left the channel
13:38
g3gg0
or 6510
13:39
BAndiT1983
6502 would be faster, than this bloated java junk, i know java can be fast and so on, but people tend to write overencumbered code a lot
13:39
g3gg0
yeah. signed bytes. yay
13:40
ArunM
joined the channel
13:40
supragya
[you can be glad you have sun, currently dark clouds are approaching frankfurt] <- I am not glad, and I have sun outside... 45C /113F
13:41
g3gg0
haha win
13:41
BAndiT1983
that's why i'm not looking forward to dubai
13:41
BAndiT1983
there is also around 45-50°C
13:42
supragya
last day my skin turned red... when I went to Indore
13:42
supragya
there had to roam a lot
13:42
g3gg0
cant you prepare sous vide at these temperatures?
13:42
BAndiT1983
funny name, reminds me of indoor ;)
13:42
g3gg0
;)
13:42
supragya
I guessed this may come up
13:43
BAndiT1983
just look for a black car and dump some eggs on the trunk, a couple of minutes and you have a meal
13:43
g3gg0
as long you remove the insects before, why not
13:43
BAndiT1983
pffff, why should one, proteins all the way
13:43
g3gg0
;D
13:43
supragya
better yet, get whey
13:44
g3gg0
okay nice chat. gonna leave now
13:44
g3gg0
have a nice day!
13:44
BAndiT1983
you too
13:45
supragya
same
13:46
supragya
left the channel
13:48
ArunM
left the channel
14:05
BAndiT1983
changed nick to: BAndiT1983|away
14:28
BAndiT1983|away
changed nick to: BAndiT1983
15:37
BogdanSorlea
joined the channel
15:44
BogdanSorlea
I have a small question about "4K HDMI Plugin Module (1x slot)" (status: not started): compared to the finalised "1x HDMI Plugin Module (1x slot)", does the aforementioned one (4K) require an actual different extension board (so new PCB/electronics design), or would it work with the finalised HDMI module (1080p), the only difference being that it would require a different architecture/design running on the FPGA and/or CPU - so basically just different code running in the devices on the camera(?)
15:44
BogdanSorlea
(I'm mostly curious right now)
15:46
se6astian
hi BogdanSorlea
15:47
se6astian
the 4k hdmi module will require a completely new PCB with FPGA on the module itself
15:47
se6astian
similar to the SDI plugin module
15:47
se6astian
the microzed fpga is not providing fast enough IO for 4K HDMI
15:49
ArunM
joined the channel
15:49
BogdanSorlea
ah, I see
15:50
ArunM
left the channel
15:59
danieel
se6astian: advice: dont spec 4K, but rather 4K30 or 4K60 because those have quite different needs (or better would be to say HDMI1.4 vs HDMI2.0)
16:19
ArunM
joined the channel
16:26
intrac
joined the channel
16:43
se6astian
changed nick to: se6astian|away
16:44
BogdanSorlea
left the channel
16:46
Kjetil
4k120 :)
16:58
ArunM
left the channel
16:59
BAndiT1983
changed nick to: BAndiT1983|away
17:00
RexOrCine
changed nick to: RexOrCine|away
17:32
ArunM
joined the channel
17:42
RexOrCine|away
changed nick to: RexOrCine
17:43
supragya
joined the channel
17:47
BAndiT1983|away
changed nick to: BAndiT1983
18:01
seaman_
joined the channel
18:02
supragya
left the channel
18:28
ArunM
left the channel
18:29
ArunM
joined the channel
18:29
ArunM
left the channel
18:29
ArunM
joined the channel
18:34
ArunM
left the channel
18:35
ArunM
joined the channel
18:39
supragya
joined the channel
18:45
se6astian|away
changed nick to: se6astian
18:52
ArunM
left the channel
18:57
ArunM
joined the channel
19:06
supragya
left the channel
19:12
TofuLynx
joined the channel
19:12
TofuLynx
Hello Everyone!
19:13
TofuLynx
Thanks supragya ! :D
19:13
TofuLynx
And, BAndiT1983, no I didn't drink and code xD
19:30
BAndiT1983
hi TofuLynx
19:30
BAndiT1983
yep, seems like repo is still okay ;)
19:30
BAndiT1983
btw. have you tried to activate travis ci for your fork yet?
19:31
BAndiT1983
added dummy unit test and let it fail on purpose, haven't investigated yet how to build separate jobs on travis ci with separate badges, so we can see if just tests or also build fails
19:32
BAndiT1983
currently trying to implement a unit test for RGB extraction, but it requires a bit of restructuring, which is not a bad thing
19:32
TofuLynx
not yet!
19:32
BAndiT1983
and also memory pool testing was started, but it's not building correctly yet
19:33
TofuLynx
I'm currently investigating how many threads are used for the even and odd rows
19:33
TofuLynx
in the original cycles
19:35
TofuLynx
memory pool = pool allocator?
19:38
BAndiT1983
don't get stuck on optimization, push it to later phase
19:38
BAndiT1983
yep, memory pool is pool allocator in that case
19:40
TofuLynx
hmm
19:40
TofuLynx
std::thread can "give" more than one system thread to the function?
19:40
BAndiT1983
don'T think so, but maybe my knowledge is not up to date
19:43
TofuLynx
So far from my research it doesnt seem to
19:43
TofuLynx
I will postpone this, I guess
19:43
TofuLynx
Have to avoid premature optimization
19:43
TofuLynx
I think I will commit the old loops, what do you think?
19:46
BAndiT1983
you can do if you want, currently it's not that important and can be postponed, soem unit tests will show us what to expect there
19:46
g3gg0
left the channel
19:47
BAndiT1983
don't be afraid to play around with the code, sometimes it helps to get the picture and to set the things right, even if it looks broken first
19:48
TofuLynx
Ok! :)
19:48
TofuLynx
also, is there a way to add a new class directly into a project on QtCreator?
19:51
BAndiT1983
nope, at least couldn't find one for cmake projects, you have to open the folder by right-clicking the entry and add a new file there
19:51
TofuLynx
ok! :)
19:51
BAndiT1983
same here -> http://www.qtcentre.org/threads/30857-Adding-files-to-cmake-project
19:52
TofuLynx
I see
19:52
BAndiT1983
partially i'm doing stuff also in vscode with cmake plugin, so i can build from there also, and of course adding files or folders works from the GUI
19:53
TofuLynx
yeah!
20:09
Kjetil
Do you really need to have a CMake-plugin? The project files are^Wshould be quite easy to change
20:11
BAndiT1983
it's for a quick build from vscode, so i don't have to change forth and back, also debugging was working nicely
20:13
BAndiT1983
so, off for short time, will be back later
20:13
BAndiT1983
changed nick to: BAndiT1983|away
20:30
TofuLynx
left the channel
20:59
TofuLynx
joined the channel
20:59
TofuLynx
Back
20:59
TofuLynx
went to dinner
21:06
BAndiT1983|away
changed nick to: BAndiT1983
21:23
TofuLynx
BAndiT1983: You were planing to add openMP to OC?
21:26
BAndiT1983
openmp tests were already there, but performance-wise not much gain, maybe new tests should be done
21:26
TofuLynx
Ok :)
21:33
BAndiT1983
but still, please focus on functionality, performance is not important first, also compiler optimizations are still not activated, like -O3
21:33
TofuLynx
Ok!
21:34
TofuLynx
I'm now creating the ProcessRed method of DebayerBilinear Class
21:34
TofuLynx
I'm thinking about how I'll do the loop
21:36
TofuLynx
the existing bilinear class has a glitch in the borders,
21:36
TofuLynx
I think I have a fast and easy solution for it
21:36
TofuLynx
First I would do a loop that iterates to every pixel that is not in the border
21:37
TofuLynx
and only then I would do a loop that does nearest interpolation in the border, avoiding glitches
21:37
TofuLynx
However, I'll postpone that for later. First a basic bilinear debayer
21:46
intrac
left the channel
21:46
intrac_
joined the channel
21:49
RexOrCine
changed nick to: RexOrCine|away
21:49
intrac
joined the channel
21:51
intrac_
left the channel
22:18
ArunM
left the channel
22:50
se6astian
changed nick to: se6astian|away
22:56
TofuLynx
BAndiT1983: the class structure is finished I think, I just have to finish the processingRGB methods
23:03
BAndiT1983
ok, looking forward to it, there will be some changes from my side soon too, but first the unit test has to be finished
23:20
TofuLynx
left the channel
23:22
slikdigit
joined the channel
23:29
TofuLynx
joined the channel
23:29
TofuLynx
Off for today
23:29
TofuLynx
Good night everyone!
23:30
TofuLynx
left the channel
23:45
BAndiT1983
changed nick to: BAndiT1983|away
23:56
slikdigit
left the channel
23:56
slikdigit
joined the channel
00:05
slikdigit
left the channel
00:06
slikdigit
joined the channel