
#apertus IRC Channel Logs

2018/04/30

Timezone: UTC


01:31
rton
left the channel
01:57
supragya
joined the channel
02:02
Bertl_oO
off to bed now ... night everyone!
02:02
Bertl_oO
changed nick to: Bertl_zZ
03:25
futarisIRCcloud
joined the channel
03:53
illwieckz
left the channel
04:00
illwieckz
joined the channel
04:12
supragya
left the channel
04:58
supragya
joined the channel
06:03
supragya
left the channel
07:32
se6astian|away
changed nick to: se6astian
07:57
niemand
joined the channel
07:57
niemand
left the channel
07:57
niemand
joined the channel
08:01
Bertl_zZ
changed nick to: Bertl
08:01
Bertl
morning folks!
08:21
g3gg0
joined the channel
08:21
g3gg0
hi
08:24
se6astian
good day
08:35
TycoKiaz
joined the channel
08:40
TycoKiaz
left the channel
08:41
rton
joined the channel
09:00
ArunM
joined the channel
09:05
futarisIRCcloud
left the channel
09:21
RexOrCine|away
changed nick to: RexOrCine
09:38
niemand
left the channel
09:50
niemand
joined the channel
10:13
intrac
left the channel
10:24
supragya
joined the channel
10:24
supragya
hi @ g3gg0
10:28
supragya
*sorry, wasn't available 2 hours ago, read the mail just now.
10:30
supragya
se6astian: Is dual ISO something that Apertus plans to implement in AXIOM in the foreseeable future? For HDR?
10:36
danieeel
that depends on whether the sensor has dual ISO, and the CMV12000 does not
10:36
danieeel
but it has other ways to achieve HDR
10:36
supragya
such as ..
10:37
danieeel
piecewise linear response curve
10:37
danieeel
changed nick to: danieel
10:40
Bertl
the CMV12k can do different exposures on even and odd rows
10:41
danieel
has that some name?
10:41
danieel
dual-expo? :)
10:41
supragya
I was wondering the same :)
10:41
supragya
http://acoutts.com/a1ex/dual_iso.pdf
10:42
danieel
dual iso is when you modify the conversion gain, not the exposure, otherwise you get some nasty artifacts
10:49
Bertl
IMHO the piecewise linear HDR is more powerful anyway ...
10:49
Bertl
off for now ... bbl
10:49
Bertl
changed nick to: Bertl_oO
10:52
Bertl_oO
and I guess the name would be dual-exposure, no? :)
10:55
supragya
I was wondering if there will be cases when two cameras' output needs to be interlaced into a single file... one thought that came to mind was a 3D rig... Any other reasons why one may interlace these into 2 streams? Is it meaningful to have 2 streams of video in a single file in a 3D rig?
10:55
supragya
interlaced as in... into separate streams
10:57
danieel
i would assume it makes sense to have it in 1 file only as a delivery format, not so much as the raw capture
10:57
danieel
usual post works in a way you throw into it a lot of input footage and it produces something meaningful
10:58
danieel
having to select substreams is extra work :)
10:58
supragya
can you explain this? would you like this to have in-camera sync if it were two streams in a single file? I think this would make sense
10:58
danieel
i do multicam VR / LF and sources are better kept separate
10:59
danieel
camera sync is about timing of sensors, completely unrelated to muxing two video tracks into a single file
11:00
g3gg0
hi. back from dinner
11:00
supragya
sync as in ordering and timing of two streams with respect to each other
11:01
supragya
hi g3gg0 !
11:04
g3gg0
dual-iso: exactly. the big plus is that the interlaced lines have the same exposure time, preventing some tearing artefacts
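Whatever the scheme is called, the interlaced even/odd lines need a merge step in post. A minimal per-pixel sketch of that merge (Python purely for illustration; the white level and exposure ratio are hypothetical values, not anything the chat specifies):

```python
def merge_dual_exposure(short_row, long_row, ratio, white=4095):
    """Per-pixel HDR merge of interlaced exposures: prefer the long
    exposure (less shadow noise) unless it clipped at the white level,
    then fall back to the short exposure scaled by the exposure ratio."""
    return [l if l < white else s * ratio
            for s, l in zip(short_row, long_row)]
```

Real implementations also have to interpolate the missing rows of each exposure before merging; this only shows the blend decision itself.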
11:05
g3gg0
still it was a technical solution for a problem that maybe apertus cams initially don't have
11:05
g3gg0
e.g. if you can use non-linear sensors
11:06
supragya
dual expo has different exposure times for odd and even lines?
11:06
supragya
that would create tearing... more of a ghosting artifact, no?
11:07
niemand
left the channel
11:07
g3gg0
well maybe that's the better technical term, yes
11:07
g3gg0
varying exposure time is good for still photography
11:08
supragya
but not for vids... I guess HDR is not for that anyways! does any of these - PLR - require additional support in the file format?
11:08
g3gg0
i am not aware of all technical solutions out there. but dual iso was a very fitting one
11:09
g3gg0
PLR?
11:09
supragya
Piecewise linear response (https://wiki.apertus.org/index.php/PLR)
11:09
g3gg0
syncing cameras. yeah, 3D. a friend of mine built those systems for large productions
11:09
g3gg0
you have to be very exact when aligning and syncing those 3D cameras
11:10
g3gg0
even the slightest error can cause trouble for the viewers in cinemas even if you cannot really measure any difference
11:11
g3gg0
ah, PLR ok. dont know if we have to transport additional data from the camera sensor to the developing application - do we?
11:11
supragya
If odds are on different exposure level than even lines, would it make sense to capture the two settings into an additional header?
11:12
g3gg0
*if* thats done, you can define your own blocks for that
11:12
supragya
a newer version of MLV you say?
11:13
g3gg0
hehe maybe its not clear. MLV's structure does not restrict the number and types of blocks that you use
11:13
supragya
I know that from day 1 :)
11:13
supragya
skip what you don't understand - for readers
11:13
g3gg0
so adding a block doesnt make a new MLV version
11:14
supragya
hmm... okay
11:14
supragya
what makes a new version then?
11:14
g3gg0
incompatible changes to the file format
11:15
g3gg0
when e.g. a VIDF suddenly has a 64 bit field for uhhh frame number ;)
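The extensibility g3gg0 describes works because every MLV block starts with the same small header, so a reader can skip any type it doesn't understand. A sketch assuming the generic (blockType, blockSize, timestamp) layout discussed here — field widths are taken from the conversation (uint32 size, uint64 microsecond timestamp), so treat the exact packing as an assumption:

```python
import struct

# blockType (4 chars), blockSize (uint32), timestamp (uint64), little-endian
MLV_HDR = struct.Struct('<4sIQ')

def make_block(block_type: bytes, timestamp_us: int, payload: bytes = b'') -> bytes:
    """Build a generic MLV-style block. blockSize covers header + payload,
    which is why adding a new block type does not need a new file version."""
    size = MLV_HDR.size + len(payload)
    return MLV_HDR.pack(block_type, size, timestamp_us) + payload

def skip_unknown(data: bytes, offset: int) -> int:
    """A reader that doesn't know a block type simply jumps blockSize ahead."""
    _, size, _ = MLV_HDR.unpack_from(data, offset)
    return offset + size
```

This is what "skip what you don't understand" means in practice: unknown blocks cost a reader one header parse and a seek.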
11:15
supragya
g3gg0: can I ask the thinking that went behind v2.1
11:15
supragya
how it is to be used?
11:16
g3gg0
you mean the field in header MLVI ?
11:16
supragya
mlv_subc_hdr_t
11:16
g3gg0
ah okay sure
11:17
supragya
nice kid photo on G+ :), your son?
11:17
g3gg0
the question was (a year or so ago) how MLV could handle multiple camera video streams
11:17
g3gg0
oh eh yeah
11:17
g3gg0
the one with my first digital camera? 4 years ago or so
11:17
supragya
:)
11:18
supragya
multiple camera video streams <- In which cases would you need it? Since for example danieel suggests that he may want the two streams in separate files
11:18
supragya
in two files I mean
11:18
danieel
PLR in file format: add a LinearizationTable (reverse LUT) as in DNG
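A DNG-style linearization table for a piecewise linear (PLR) response can be built by interpolating between the knee points of the compression curve. A sketch with hypothetical knee values (the actual CMV12000 knee points are not given in this discussion):

```python
import bisect

def reverse_lut(knees, max_code):
    """knees: list of (code, linear) points describing the piecewise linear
    compression applied by the sensor. Returns a per-code table mapping each
    stored code back to a linear value, like DNG's LinearizationTable."""
    codes = [c for c, _ in knees]
    table = []
    for code in range(max_code + 1):
        # find the segment this code falls into, clamping at the ends
        i = max(min(bisect.bisect_right(codes, code), len(knees) - 1), 1)
        (c0, l0), (c1, l1) = knees[i - 1], knees[i]
        t = (code - c0) / (c1 - c0)
        table.append(l0 + t * (l1 - l0))
    return table
```

A raw developer would apply this table to every sample before demosaicing, which is why storing it (or the knee points) in the file is all the format support PLR really needs.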
11:20
g3gg0
okay one question - is the stream to be made by *one* processor who merges the streams and writes the files?
11:20
supragya
I think, no
11:20
g3gg0
or is the rig a genlock'd one with two cameras?
11:20
supragya
the second one
11:21
g3gg0
then its as easy as this: genlock the two cameras, hardsync the record start and write two separate files (chunked is also okay)
11:22
supragya
genlock?
11:22
g3gg0
locking the two system clocks
11:22
supragya
got it
11:22
g3gg0
so they clock at the same rate -> same FPS, same timebase
11:22
g3gg0
in MLV every block has its timestamp relative to "recording start"
11:23
supragya
if that is sorted, why would you need streams? couldn't figure any other case
11:23
g3gg0
it is stored in microseconds
11:23
supragya
since only one sensor per camera can give one stream
11:23
g3gg0
can you rephrase the question, i dont understand it
11:23
g3gg0
what is sorted and which streams do you mean
11:24
supragya
why would you need multiple streams in a video file?
11:24
g3gg0
ah okay for use cases where one camera has more sensors
11:24
g3gg0
there was some time ago the theoretical use case of a single camera with two sensors iirc
11:25
supragya
o.O
11:25
g3gg0
maybe for 3D or any other use case
11:25
g3gg0
and i was asked if MLV could handle that. i said that it would be not hard to add "subchannels"
11:25
g3gg0
also for audio
11:26
supragya
It makes sense to keep the RAW output in separate files that are genlock(ed) as you said... was having a single CPU a consideration for multi stream vs multi file
11:26
g3gg0
no idea if anyone has this goal, but would be simple to add
11:27
g3gg0
well, they can get merged later to have one file per scene
11:27
supragya
so in conclusion, am I wrong in saying that there is a limited use case of adding multiple video streams in a single file
11:29
g3gg0
i dont know of any highly critical workflow that requires one file with separate streams.
11:29
g3gg0
that doesnt speak for everyone involved :)
11:29
danieel
the multistream file is useful as an editing format (multiple video layers)
11:30
g3gg0
for one scene or for 3D?
11:30
supragya
could you elaborate this use case?
11:30
g3gg0
or both? :D
11:30
danieel
directly camera related could be to have an OSD in separate video track
11:30
danieel
some cameras can output either clean picture or burned in osd on the video out
11:30
g3gg0
right, i remember this example too. thanks
11:31
danieel
storing the osd layer separately would make sense
11:31
danieel
(use case: educational videos)
11:31
supragya
OSD?
11:31
supragya
on screen display?
11:31
danieel
on screen display - status information, grids, etc
11:32
g3gg0
but iirc that one had a much lower requirement for data rates etc
11:32
danieel
yep, you can do RLE on that data
11:32
supragya
OSD includes an overlay that we see on camera?
11:32
danieel
yes
11:32
g3gg0
so you could simply define a block OSDI (OSD Image) and store some compressed frames here with reduced rate
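A naive run-length encoder of the kind danieel suggests for a mostly-flat OSD overlay. Note the OSDI block itself is only a proposal in this conversation, not part of the existing MLV spec:

```python
def rle_encode(data: bytes) -> bytes:
    """Byte-wise run-length encoding: emit (run_length, value) pairs.
    Cheap to decode and very effective on flat OSD graphics with large
    single-colour areas."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while run < 255 and i + run < len(data) and data[i + run] == data[i]:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)
```

For a grid-and-text overlay most rows are long runs of a transparent value, so even this trivial scheme keeps the OSD track's data rate far below the video track's.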
11:33
g3gg0
yeah
11:34
danieel
does MLV define per frame duration or just occurrence timestamp?
11:34
g3gg0
timestamp
11:34
g3gg0
uint64_t timestamp — hardware counter timestamp for this frame (relative to recording start)
11:34
supragya
what's this?
11:34
g3gg0
call it the presentation time
11:34
supragya
okay... the timestamp of frame is it?
11:34
g3gg0
yes
11:35
g3gg0
every block has its timestamp. thats the time when this block was captured/created
11:35
supragya
a custom WB has what options to configure? color temp... tint... and?
11:35
g3gg0
if you record an mlv at 1 fps, every VIDF will have the timestamp increased by 1000000
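The timestamp convention g3gg0 describes (microseconds relative to recording start, one per VIDF block) can be sketched in a couple of lines:

```python
def vidf_timestamps(fps, n_frames):
    """Microsecond presentation times relative to recording start,
    following the MLV convention: at 1 fps each VIDF's timestamp
    increases by 1,000,000."""
    return [round(i * 1_000_000 / fps) for i in range(n_frames)]
```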
11:35
danieel
how do you define capture time with rolling shutter ? :)
11:35
danieel
(the phase)
11:36
g3gg0
iirc it was start of exposure of this frame
11:36
danieel
of first line I assume
11:36
g3gg0
so when the first line was exposed
11:36
supragya
I guessed that too :)
11:36
supragya
but never knew was correct
11:36
g3gg0
(not sure if our implementation in camera uhmm perfectly reflects this ;) )
11:37
danieel
that is often a variable location if you change exposure, as the reference point is usually the conversion/readout of the line, which happens periodically. Exposure start varies into "negative time"
11:37
g3gg0
as long all frames have the same reference, its okay
11:37
g3gg0
so you mean that exposure might change in the middle of the frame?
11:37
g3gg0
i prefer vsync ;)
11:37
danieel
modern sensors won't allow you to do that; some produce garbage
11:38
g3gg0
(makes not much sense anyway)
11:38
supragya
really getting lost out here... why would exposure change mid flight
11:38
g3gg0
within one frame would not make too much sense if you rely on existing toolchains
11:39
g3gg0
i'd take it as given that exposure time etc changes between frames
11:39
supragya
in that case a header induces this change for upcoming frames in MLV, doesn't it?
11:40
g3gg0
EXPO / LENS block, yes
11:40
supragya
mid flight exposure changes would be more or less like a rolling shutter + flash artifact
11:40
g3gg0
the VIDF after(*) the EXPO/LENS block which changed settings, will get affected
11:41
danieel
after is > or >= ?
11:41
g3gg0
(*) after: always related to the time -> timestamp
11:41
supragya
so g3gg0, regarding multi streams
11:41
g3gg0
in theory and physics these two events can not happen at the same time ;)
11:41
supragya
how are they implemented in MLV?
11:42
supragya
like expo/lens block that changes settings for upcoming vid frames?
11:42
g3gg0
if you produce a EXPO with ts=100 and a VIDF with ts=100 then you might be in trouble for the first line even when explaining the scenario
11:42
g3gg0
does the first line that is being exposed use the old settings or the new ones?
11:43
supragya
if you do it like this... you swap between streams quite often... so multiple headers for no use..
11:43
supragya
can you explain this.... could not wrap my head around this
11:44
g3gg0
conclusion: we would do well to make sure the EXPO/LENS block is queued before the exposure starts, ergo i'd write them with a timestamp lower than the next frame
11:44
g3gg0
(but yeah, specification leaves this open)
11:45
g3gg0
@supragya: we have 20 bytes overhead then. for several megabyte blocks.
11:46
supragya
yes, but I was wondering if we could do better...
11:46
supragya
like constant offset checking...
11:46
g3gg0
offset checking?
11:47
supragya
(after every 100 MB maybe the stream info has to be rechecked) if it is still stream 1 or not
11:47
supragya
in case you swap constantly, you write after a given offset
11:47
g3gg0
nah that reminds me of AVI and such
11:47
supragya
but I guess this solution is very inefficient
11:48
supragya
g3gg0: exactly my thought... AVI
11:48
danieel
i see the drawback of needing to scan the whole file (with modern caches you basically need to read the whole file to open it..)
11:48
supragya
inefficient as in.... buffer to disk flush
11:48
supragya
danieel: my worry too
11:49
g3gg0
@danieel: yes that is the downside. when you process the file, you have to build an index. in camera with its 200MHz cpu (?) it takes maybe 5 seconds for a gigabyte of video
11:49
ArunM
left the channel
11:49
supragya
MLV suggests we create an index on first read and then it would all be easier
11:49
BAndiT1983|away
changed nick to: BAndiT1983
11:49
supragya
hello BAndiT1983
11:50
g3gg0
but that index is being created once and then can be stored in an .idx file if you want to keep it
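Building that index is a single linear pass over the block headers. A sketch, assuming every block carries the generic (type, size, timestamp) header discussed above; the real MLVI file header at the start of an .mlv differs, so this is illustrative only:

```python
import struct

def build_index(data: bytes):
    """One linear scan: record (timestamp, type, offset) for every block,
    then sort by timestamp to get processing order. The result is what an
    .idx cache file would persist so the scan happens only once."""
    hdr = struct.Struct('<4sIQ')  # blockType, blockSize, timestamp
    index, offset = [], 0
    while offset + hdr.size <= len(data):
        btype, size, ts = hdr.unpack_from(data, offset)
        index.append((ts, btype.decode('ascii', 'replace'), offset))
        offset += max(size, hdr.size)  # guard against corrupt zero sizes
    return sorted(index)
```

Sorting by timestamp is what makes out-of-order writes (and multi-chunk recordings) cheap for the writer: the reader reorders once, at index-build time.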
11:50
BAndiT1983
hi supragya
11:50
g3gg0
hi
11:50
danieel
sort of header only dump with data left out would be better?
11:50
danieel
ah ok
11:50
danieel
so you have that
11:50
BAndiT1983
hi g3gg0, reading the logs all along, but decided to join while the other PC is importing a large database dump
11:51
g3gg0
hehe, welcome
11:51
supragya
seems like I'm lost at sea... could I ask again... what all changes when white balance changes?
11:51
g3gg0
yeah the index is some "comfort" feature like a cache
11:51
supragya
exposure? color temp? tint?
11:51
g3gg0
WB -> no change to raw data
11:52
BAndiT1983
hope you are focussing on general things first and not too much on details
11:52
g3gg0
its just the user's configuration what he wants to be considered in post
11:53
g3gg0
@BAndiT1983: yeah getting very deep occasionally. but i consider it to be necessary for the "big picture"
11:53
g3gg0
to understand what is important right now and what not
11:53
supragya
BAndiT1983: I know... but only then comparison would be meaningful
11:53
danieel
actually wb change could optimize the gains of individual channels - is there a separate GAIN meta info for this in mlv?
11:53
g3gg0
mlv_wbal_hdr_t
11:53
supragya
danieel: https://docs.google.com/spreadsheets/d/1ItNuLj34JlK6bZgiAJLrmG7p3iyCmCju4XYyiwjqJOM/edit#gid=0
11:54
g3gg0
uint32_t wbgain_r = 2048 (only when wb_mode is WB_CUSTOM); uint32_t wbgain_g = 1024 (1024 = 1.0); uint32_t wbgain_b = 2048; note: it's 1/canon_gain (uses dcraw convention)
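Since WB doesn't touch the raw data, a post tool applies these fixed-point gains (1024 == 1.0) itself. A sketch of one channel; note the 1/canon_gain (dcraw) convention mentioned above may mean the stored value needs inverting first, so treat the direction as an assumption:

```python
def apply_gain(value, wbgain, black=0):
    """Apply a fixed-point white-balance gain (1024 == 1.0) to one raw
    sample, scaling only the signal above the black level."""
    return black + ((value - black) * wbgain) // 1024
```

In a real pipeline this runs per CFA channel (r, g, b — or g1/g2 separately if a format ever stores four gains) after black subtraction and before demosaicing.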
11:54
g3gg0
hehe better.
11:54
g3gg0
line 144
11:54
danieel
no g2 gain? :)
11:54
g3gg0
in our canon-case this is only additional info
11:55
supragya
no g2 gain it seems
11:55
g3gg0
nope. want to set the second green gain separately?
11:55
supragya
others 2048 -> g2 1024
11:55
supragya
*g
11:55
BAndiT1983
do you mean g1 and g2?
11:55
g3gg0
canon did this on some models but not sure if that is the holy grail
11:55
BAndiT1983
like in rg1g2b?
11:56
supragya
i think that is what he meant
11:56
danieel
yes that
11:56
BAndiT1983
what is the purpose of setting a different gain for g2?
11:56
danieel
it's raw, so gains often relate to the CFA (and sensors do that)
11:56
danieel
it might be a non RGGB sensor, like RGBA
11:57
danieel
*RGBIr
11:57
g3gg0
for these special cases -> new block type
11:57
supragya
new block type <- a solution to everything?
11:57
supragya
where is the MLV standard then?
11:58
g3gg0
if the given things dont fit at all, we have to extend it
11:58
danieel
MLV is a container, so new block type is acceptable for me :)
11:58
g3gg0
i'd try to keep all at most backward-compatible as it is possible
11:59
supragya
if everything can be extended and accommodated for in MLV, there is no need for comparison with other formats :)
11:59
BAndiT1983
mpeg allows the same, tiff also etc.
11:59
danieel
the video data "codec"... is it like packaged uncompressed, or are any alternatives used? (dct/vle...)
11:59
supragya
no
12:00
g3gg0
there are two variants a) raw canon bayer bitstream b) JPEG92
12:00
supragya
but an new block if we need this XD
12:00
g3gg0
no sarcasm please ;)
12:00
danieel
canon raw is what exactly then?
12:00
g3gg0
(kidding)
12:00
BAndiT1983
offtopic a bit: one example of multiple video streams in a stream came to my mind from my studies, we had to decode a satellite mpeg stream and there were multiple video streams plus additional packages, like date, UTC etc.
12:00
g3gg0
it is.... ugly
12:01
g3gg0
the 14 bit stream packed together into 16 bits with the endianness reversed
12:01
g3gg0
probably an easy thing in FPGA
12:01
g3gg0
(definitely easy)
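The tight 14-bit packing itself can be illustrated as below. This deliberately ignores the Canon-specific byte swap of 16-bit words that g3gg0 calls "ugly" (raw.h in Magic Lantern documents the real layout), and just treats the data as an MSB-first bitstream:

```python
def unpack14(packed: bytes):
    """Unpack a tightly packed 14-bit-per-sample bitstream (MSB-first).
    Simplified sketch: the actual Canon raw stream additionally reverses
    byte order within each 16-bit word."""
    out, acc, nbits = [], 0, 0
    for byte in packed:
        acc = (acc << 8) | byte
        nbits += 8
        if nbits >= 14:
            nbits -= 14
            out.append(acc >> nbits)   # take the top 14 accumulated bits
            acc &= (1 << nbits) - 1    # keep the remainder for the next sample
    return out
```

The same accumulator loop handles any bit depth up to 16 by swapping the constant, which is why MLV can carry 8 to 16 bpp payloads without format changes.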
12:01
danieel
BAndiT1983: that is generally the case with DVB as well - you get a stream of the multiplex you are tuned in to. Guess where the multiplex name came from :P
12:02
supragya
BAndiT1983: satellite images have as many as 21 streams at once... I had some research consideration on that part in college
12:02
danieel
so it can support up to 16 bit anything (properly aligned to lsb/msb)
12:03
supragya
EDMAC alignment?
12:03
g3gg0
yes, mlv format can handle from (uuuuh 8?) to 16 bpp
12:03
BAndiT1983
wait till 4k is streamed over satellite, how it will reduce the bandwidth ;)
12:04
g3gg0
edmac isnt the culprit
12:04
supragya
i mean what is edmac alignment?
12:04
g3gg0
you mean frameSpace ?
12:04
supragya
yes
12:05
BAndiT1983
https://www.magiclantern.fm/forum/index.php?topic=17069.25
12:05
g3gg0
keep it zero and forget about that
12:05
g3gg0
set it to a value and this instructs the reader to ignore n bytes in the payload
12:05
supragya
n bytes hence?
12:06
supragya
n bytes from the point where the value is given?
12:06
g3gg0
its just for some special cases to allow the writer to optimize the memory locations where DMA would write to
12:06
g3gg0
you could write in a way so that the raw payload starts at a sector start
12:07
g3gg0
(SD/CF card sector start)
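Computing a frameSpace value so the raw payload lands on a card sector boundary might look like this. The 64-byte header size is a hypothetical figure for illustration; the point is only the alignment arithmetic:

```python
SECTOR = 512      # SD/CF card sector size
HDR_SIZE = 64     # hypothetical VIDF header size, for illustration

def frame_space(block_offset, hdr_size=HDR_SIZE, sector=SECTOR):
    """Padding bytes (frameSpace) inserted after the VIDF header so the
    raw payload starts exactly on a sector boundary. Readers simply skip
    these n bytes of the payload."""
    payload_start = block_offset + hdr_size
    return (-payload_start) % sector
```

The writer benefits because sector-aligned DMA destinations avoid read-modify-write cycles on the card; a reader that ignores alignment just honours the skip.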
12:07
supragya
what are the reasons of additions of mlv_colp_hdr_t
12:07
supragya
?
12:07
g3gg0
or in our case primarily the DMA destination address limitations.
12:08
g3gg0
nah just drop that
12:08
supragya
okay
12:09
g3gg0
wondered if we could store the bayer pixel information in some "generic" way
12:09
g3gg0
along with color matrices and such
12:09
BAndiT1983
bson?
12:09
g3gg0
bson? ?
12:09
supragya
as in if it is RGGB or GRGB?
12:09
BAndiT1983
binary json
12:10
g3gg0
json -> /me: screaming and running away
12:10
g3gg0
;)
12:10
BAndiT1983
:D
12:10
g3gg0
@supragya: yep and the bit coding and how to get RGB from them
12:10
BAndiT1983
what is the generic way for you?
12:10
g3gg0
it is empty. none found. :)
12:10
supragya
i thought bson -> bison (grammar parser)... reminds me of last GSoC prep
12:11
g3gg0
hehe
12:11
BAndiT1983
give me a bit more info on this topic, as cfa pattern is in MLV, but what do we need besides it?
12:11
supragya
[bit coding and how to get RGB from them] <- example?
12:11
BAndiT1983
flex and bison, nostalgia here, as there was a nice tutorial series back then, about own scripting language development
12:11
supragya
wrote my own
12:12
supragya
hlang.supragyaraj.com
12:12
BAndiT1983
now we have 5 millions and 1 scripting languages out there ;)
12:12
BAndiT1983
you have used ubuntu naming scheme for it? ;)
12:12
g3gg0
https://xkcd.com/927/
12:13
supragya
XD
12:13
BAndiT1983
Bertl_oO also uses that XKCD a lot here :D
12:13
g3gg0
yeah, now talk about MLV
12:13
g3gg0
;)
12:13
supragya
g3gg0: thank you :)
12:13
supragya
yes... [bit coding and how to get RGB from them] about this
12:14
BAndiT1983
so, bit coding?
12:14
g3gg0
i have an excuse: MLV is designed to be simple and written and read with few resources
12:14
g3gg0
ok back to topic
12:14
BAndiT1983
is it about something like current raw12?
12:14
g3gg0
https://bitbucket.org/hudson/magic-lantern/src/0e6493e8ac5e118976b94237b5048c436f379d98/src/raw.h?at=unified&fileviewer=file-view-default
12:14
BAndiT1983
MLV is not the problem, it solves the overencumbering with many formats
12:15
g3gg0
see the header
12:15
g3gg0
it explains how bits are stored
12:15
BAndiT1983
know that header
12:15
BAndiT1983
14bit stream, which gave me a bit of headache to decode back then
12:16
g3gg0
yeah
12:16
supragya
is that a direction I should look at for my answer?
12:16
g3gg0
thats how "non-compressed" data is stored
12:17
danieel
it would be enough to say it is TIFF compatible (or that it is not)... it defines tight packing well
12:19
BAndiT1983
before packing considerations we should consult FPGA developer
12:20
supragya
I guess I more or less understand what MLV looks like... time to know others (MPX maybe)
12:20
g3gg0
cool
12:20
g3gg0
thats the important part of your work.
12:21
danieel
i know DNG / MP4 if you have questions
12:21
supragya
those into consideration too... I wonder which is better to look at first
12:21
g3gg0
@supragya: you know https://www.magiclantern.fm/forum/index.php?topic=13152.0
12:21
g3gg0
MLVFS
12:22
supragya
yes made a frameserver for AVI
12:22
supragya
on fuse
12:22
g3gg0
wait. you made it already?
12:23
supragya
on the verge... need about 3-4 days of rigorous work
12:23
supragya
wait a moment
12:23
BAndiT1983
g3gg0, it was a simple one, so we can test if OC can provide data
12:23
BAndiT1983
through AVI
12:24
supragya
link: https://nofile.io/f/zejFC7uhNPg/GSoC+2018+-+libfuse-FrameServer+_+apertus.org.pdf
12:24
supragya
(my rejected proposal :( )
12:24
supragya
the current work explained in section 1
12:25
g3gg0
ah okay. it is in concept state, right?
12:25
supragya
kind of, yes... did a bit of testing of some modules..
12:26
g3gg0
ok
12:26
supragya
why do you ask
12:26
ArunM
joined the channel
12:26
g3gg0
was confused because i sent you the MLVFS link and you replied you made an avi frameserver. i thought you put it into MLVFS :D
12:27
supragya
:P
12:32
g3gg0
btw i liked your proposal. but it made more sense to me to define the file format first
12:33
supragya
thanks g3gg0, I think I would be going for now... some household errands... will get in touch if find something else
12:33
ArunM
left the channel
12:34
g3gg0
sure. same here. lawn mowing in the sun. know better things to do
12:34
g3gg0
things that don't involve sun
12:34
BAndiT1983
like home office?
12:35
BAndiT1983
you can be glad you have sun, currently dark clouds are approaching frankfurt
12:35
supragya
seems like BAndiT1983 is in trouble! storm approaching XD
12:36
g3gg0
home office. oh well, i said i wont work today. but still i checked and replied mails.
12:36
BAndiT1983
had to opt for home office, but it's a bit tedious as i'm waiting mostly for the PC to compile or to do other stuff, not much coding because of that
12:37
g3gg0
for which target?
12:37
BAndiT1983
when it's over then i will sit down on the balcony and do some stuff for opencine, maybe also some work on MLV loader, so i can playback stuff finally
12:37
supragya
some age old technology he told me
12:37
BAndiT1983
it's GWT (google web toolkit) and java server
12:38
BAndiT1983
nothing exciting
12:38
g3gg0
oh i expected 6502 now
12:38
ArunM
joined the channel
12:38
ArunM
left the channel
12:38
g3gg0
or 6510
12:39
BAndiT1983
6502 would be faster, than this bloated java junk, i know java can be fast and so on, but people tend to write overencumbered code a lot
12:39
g3gg0
yeah. signed bytes. yay
12:40
ArunM
joined the channel
12:40
supragya
[you can be glad you have sun, currently dark clouds are approaching frankfurt] <- I am not glad, and I have sun outside... 45C /113F
12:41
g3gg0
haha win
12:41
BAndiT1983
that's why i'm not looking forward to dubai
12:41
BAndiT1983
there is also around 45-50°C
12:42
supragya
last day my skin turned red... when I went to Indore
12:42
supragya
there had to roam a lot
12:42
g3gg0
cant you prepare sous vide at these temperatures?
12:42
BAndiT1983
funny name, reminds me of indoor ;)
12:42
g3gg0
;)
12:42
supragya
I guessed this may come up
12:43
BAndiT1983
just look for a black car and dump some eggs on the trunk, a couple of minutes and you have a meal
12:43
g3gg0
as long you remove the insects before, why not
12:43
BAndiT1983
pffff, why should one, proteins all the way
12:43
g3gg0
;D
12:43
supragya
better yet, get whey
12:44
g3gg0
okay nice chat. gonna leave now
12:44
g3gg0
have a nice day!
12:44
BAndiT1983
you too
12:45
supragya
same
12:46
supragya
left the channel
12:48
ArunM
left the channel
13:05
BAndiT1983
changed nick to: BAndiT1983|away
13:28
BAndiT1983|away
changed nick to: BAndiT1983
14:37
BogdanSorlea
joined the channel
14:44
BogdanSorlea
I have a small question about "4K HDMI Plugin Module (1x slot)" (status: not started): compared to the finalised "1x HDMI Plugin Module (1x slot)", does the aforementioned one (4K) require an actual different extension board (so new PCB/electronics design), or would it work with the finalised HDMI module (1080p), the only difference being that it would require a different architecture/design running on the FPGA and/or CPU - so basically just different code running in the devices on the camera(?)
14:44
BogdanSorlea
(I'm mostly curious right now)
14:46
se6astian
hi BogdanSorlea
14:47
se6astian
the 4k hdmi module will require a completely new PCB with FPGA on the module itself
14:47
se6astian
similar to the SDI plugin module
14:47
se6astian
the microzed fpga is not providing fast enough IO for 4K HDMI
14:49
ArunM
joined the channel
14:49
BogdanSorlea
ah, I see
14:50
ArunM
left the channel
14:59
danieel
se6astian: advice: dont spec 4K, but rather 4K30 or 4K60 because those have quite different needs (or better would be to say HDMI1.4 vs HDMI2.0)
15:19
ArunM
joined the channel
15:26
intrac
joined the channel
15:43
se6astian
changed nick to: se6astian|away
15:44
BogdanSorlea
left the channel
15:46
Kjetil
4k120 :)
15:58
ArunM
left the channel
15:59
BAndiT1983
changed nick to: BAndiT1983|away
16:00
RexOrCine
changed nick to: RexOrCine|away
16:32
ArunM
joined the channel
16:42
RexOrCine|away
changed nick to: RexOrCine
16:43
supragya
joined the channel
16:47
BAndiT1983|away
changed nick to: BAndiT1983
17:01
seaman_
joined the channel
17:02
supragya
left the channel
17:28
ArunM
left the channel
17:29
ArunM
joined the channel
17:29
ArunM
left the channel
17:29
ArunM
joined the channel
17:34
ArunM
left the channel
17:35
ArunM
joined the channel
17:39
supragya
joined the channel
17:45
se6astian|away
changed nick to: se6astian
17:52
ArunM
left the channel
17:57
ArunM
joined the channel
18:06
supragya
left the channel
18:12
TofuLynx
joined the channel
18:12
TofuLynx
Hello Everyone!
18:13
TofuLynx
Thanks supragya ! :D
18:13
TofuLynx
And, BAndiT1983, no I didn't drink and code xD
18:30
BAndiT1983
hi TofuLynx
18:30
BAndiT1983
yep, seems like repo is still okay ;)
18:30
BAndiT1983
btw. have you tried to activate travis ci for your fork yet?
18:31
BAndiT1983
added dummy unit test and let it fail on purpose, haven't investigated yet how to build separate jobs on travis ci with separate badges, so we can see if just tests or also build fails
18:32
BAndiT1983
currently trying to implement a unit test for RGB extraction, but it requires a bit of restructuring, which is not a bad thing
18:32
TofuLynx
not yet!
18:32
BAndiT1983
and also memory pool testing was started, but it's not building correctly yet
18:33
TofuLynx
I'm currently investigating how many threads are used for the even and odd rows
18:33
TofuLynx
in the original cycles
18:35
TofuLynx
memory pool = pool allocator?
18:38
BAndiT1983
don't get stuck on optimization, push it to later phase
18:38
BAndiT1983
yep, memory pool is pool allocator in that case
18:40
TofuLynx
hmm
18:40
TofuLynx
std::thread can "give" more than one system thread to the function?
18:40
BAndiT1983
don't think so, but maybe my knowledge is not up to date
18:43
TofuLynx
From my research so far, it doesn't seem to
18:43
TofuLynx
I will postpone this, I guess
18:43
TofuLynx
Have to avoid premature optimization
18:43
TofuLynx
I think I will commit the old loops, what do you think?
18:46
BAndiT1983
you can do that if you want, currently it's not that important and can be postponed, some unit tests will show us what to expect there
18:46
g3gg0
left the channel
18:47
BAndiT1983
don't be afraid to play around with the code, sometimes it helps to get the picture and to set the things right, even if it looks broken first
18:48
TofuLynx
Ok! :)
18:48
TofuLynx
also, is there a way to add a new class directly into a project on QtCreator?
18:51
BAndiT1983
nope, at least couldn't find one for cmake projects, you have to open the folder by right-clicking the entry and add a new file there
18:51
TofuLynx
ok! :)
18:51
BAndiT1983
same here -> http://www.qtcentre.org/threads/30857-Adding-files-to-cmake-project
18:52
TofuLynx
I see
18:52
BAndiT1983
partially i'm doing stuff also in vscode with cmake plugin, so i can build from there also, and of course adding files or folders works from the GUI
18:53
TofuLynx
yeah!
19:09
Kjetil
Do you really need to have a CMake-plugin? The project files are^Wshould be quite easy to change
19:11
BAndiT1983
it's for a quick build from vscode, so i don't have to switch back and forth, also debugging was working nicely
19:13
BAndiT1983
so, off for short time, will be back later
19:13
BAndiT1983
changed nick to: BAndiT1983|away
19:30
TofuLynx
left the channel
19:59
TofuLynx
joined the channel
19:59
TofuLynx
Back
19:59
TofuLynx
went for dinner
20:06
BAndiT1983|away
changed nick to: BAndiT1983
20:23
TofuLynx
BAndiT1983: You were planning to add OpenMP to OC?
20:26
BAndiT1983
openmp tests were already there, but performance-wise not much gain, maybe new tests should be done
20:26
TofuLynx
Ok :)
20:33
BAndiT1983
but still, please focus on functionality, performance is not important first, also compiler optimizations are still not activated, like -O3
20:33
TofuLynx
Ok!
20:34
TofuLynx
I'm now creating the ProcessRed method of DebayerBilinear Class
20:34
TofuLynx
I'm thinking about how I'll do the loop
20:36
TofuLynx
the existing bilinear class has a glitch in the borders,
20:36
TofuLynx
I think I have a fast and easy solution for it
20:36
TofuLynx
First I would do a loop that iterates over every pixel that is not on the border
20:37
TofuLynx
and only then I would do a loop that does nearest interpolation on the border, avoiding glitches
20:37
TofuLynx
However, I'll postpone that for later. First a basic bilinear debayer
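The border-safe interpolation TofuLynx describes can be sketched for the green channel. Here coordinate clamping plays the role of the nearest-neighbour border pass (an RGGB mosaic and a flat row-major buffer are assumptions of this sketch, not details from the chat):

```python
def green_channel(bayer, w, h):
    """Bilinear green reconstruction for an RGGB mosaic: copy at green
    sites, average the four green neighbours elsewhere. Clamping the
    coordinates gives nearest-neighbour behaviour at the 1px border,
    avoiding the out-of-bounds glitch."""
    def g(x, y):
        x = min(max(x, 0), w - 1)   # clamp = nearest-neighbour at borders
        y = min(max(y, 0), h - 1)
        return bayer[y * w + x]
    out = []
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 1:            # green site in RGGB
                out.append(g(x, y))
            else:                            # red/blue site: average neighbours
                out.append((g(x - 1, y) + g(x + 1, y) +
                            g(x, y - 1) + g(x, y + 1)) // 4)
    return out
```

The two-loop variant (interior pass without clamping, then a border-only pass) computes the same result but avoids the per-pixel clamp cost in the hot inner loop.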
20:46
intrac
left the channel
20:46
intrac_
joined the channel
20:49
RexOrCine
changed nick to: RexOrCine|away
20:49
intrac
joined the channel
20:51
intrac_
left the channel
21:18
ArunM
left the channel
21:50
se6astian
changed nick to: se6astian|away
21:56
TofuLynx
BAndiT1983: the class structure is finished I think, I just have to finish the processingRGB methods
22:03
BAndiT1983
ok, looking forward to it, there will be some changes from my side soon too, but first the unit test has to be finished
22:20
TofuLynx
left the channel
22:22
slikdigit
joined the channel
22:29
TofuLynx
joined the channel
22:29
TofuLynx
Off for today
22:29
TofuLynx
Good night everyone!
22:30
TofuLynx
left the channel
22:45
BAndiT1983
changed nick to: BAndiT1983|away
22:56
slikdigit
left the channel
22:56
slikdigit
joined the channel
23:05
slikdigit
left the channel
23:06
slikdigit
joined the channel