#apertus IRC Channel Logs

2022/10/10

Timezone: UTC


00:23
intrac
left the channel
00:29
intrac
joined the channel
06:25
se6astian
cool, yes I will dig them out for you
06:27
se6astian
https://cloud.apertus.org/index.php/s/xZZDjmfwBEsinid
06:27
se6astian
morrigan still did not fix the ssl cert error unfortunately
06:28
se6astian
but if you press "I accept the risk..."
08:01
se6astian
ssl should be fixed
10:18
anuejn
se6astian: do you have that data also in machine-readable form?
10:18
anuejn
e.g. as csv?
10:18
anuejn
it is quite cumbersome to extract that from the graphs
10:25
Bertl
https://automeris.io/WebPlotDigitizer/
10:44
se6astian
I don't, sorry
10:47
vup
anuejn: interesting, why the cutoff at 700nm?
10:49
anuejn
vup: the cutoff is at 730nm because the data for the color checker reflectivity only goes up to that wavelength
10:49
vup
interesting
10:49
vup
that seems pretty low
13:53
vup
anuejn: se6astian: this is what you get if you use the matrix obtained this way to actually reproduce the color checker:
13:53
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A.pdf
13:53
vup
https://f.coroot.de/cmv12k_color_response/color_checker_D65.pdf
13:54
vup
(I think there is a small typo in the upstream notebook, I uploaded the one I used to generate these here: https://f.coroot.de/cmv12k_color_response/color_matrices.ipynb)
13:55
vup
it seems like a simple matrix-based calibration / correction will not really work / will be a bit unsatisfactory
14:56
anuejn
interesting
14:56
se6astian
this is generated purely from the image sensor response curve + uv/ir filter spectral curve?
15:00
vup
se6astian: purely from image sensor response curve (no filter spectral curve)
15:01
anuejn
though a hard cutoff at 380nm and 730nm is assumed
15:11
se6astian
not bad!
15:16
vup
(but a cutoff at these wavelengths is "standard" according to wikipedia)
15:19
se6astian
I picked up the IT8 chart plus FilmGrade™ WHITE LED Strip Lights from waveform lighting
15:20
se6astian
will forward to max soon so he can capture more samples
15:22
se6astian
https://store.waveformlighting.com/products/filmgrade-led-strip-lights-for-film-photography
16:00
se6astian
MEETING TIME, who is here?
16:00
Bertl
is here ...
16:02
se6astian
any news Bertl?
16:03
se6astian
do you want to briefly summarize the mustafa meeting?
16:03
Bertl
yes, I have some news to share (besides the meeting with mustafa_ :)
16:04
BAndiT1983_
is here
16:05
Bertl
the LAM/INT project is currently hitting the hot/final phase, and I've been spending some quality time to get everything working there
16:05
Bertl
among other things, this means that we now have successful full and cut-out image transfers via dual HDMI capture using Magewell PCIe cards
16:05
se6astian
great!
16:07
Bertl
the sequencer is currently operating in pure userspace, using generic UIO (userspace I/O) devices to receive interrupts
16:07
Bertl
it is a simple python implementation for now but should probably be moved to C or Rust in the future and also made a realtime process
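As a rough illustration of the userspace approach described above, a minimal Python sketch for waiting on interrupts via a generic UIO device could look like the following; the device node and the re-arm write are assumptions based on the standard uio_pdrv_genirq driver, not the actual sequencer code.

```python
# Minimal sketch of interrupt handling via the Linux UIO framework.
# /dev/uio0 is a hypothetical device node; the real sequencer is not shown here.
import os
import struct

UIO_DEV = "/dev/uio0"

fd = os.open(UIO_DEV, os.O_RDWR)
try:
    while True:
        # Re-enable/unmask the interrupt (required for uio_pdrv_genirq).
        os.write(fd, struct.pack("I", 1))
        # Blocking read returns a 4-byte interrupt counter once an IRQ fires.
        (irq_count,) = struct.unpack("I", os.read(fd, 4))
        # ... run one sequencer step here ...
        print("interrupt", irq_count)
finally:
    os.close(fd)
```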
16:07
Bertl
the capture solution is also python based and uses the Magewell SDK
16:08
Bertl
at least on the 4 port HDMI card we do not have any problems capturing the full bandwidth we can output on the Axiom Beta
16:09
Bertl
there has been some work regarding binning and sending two half-frames as well, but that isn't working as intended yet
16:09
Bertl
that said, I'm still hacking on it and I'm kind of optimistic that we will get that working this week
16:10
Bertl
now for the meeting with mustafa_, let's wait a little and see if he is around, maybe he wants to report himself?
16:11
se6astian
right
16:12
Bertl
that's it from my side for this week
16:13
se6astian
is part of the LAM project to also document the solution and prepare the tools for github?
16:13
se6astian
or do "we" need to cover that?
16:13
Bertl
I'm pretty sure we need to cover that
16:13
se6astian
ok
16:14
mustafa_
Hi guys, I have started working on the new MYIR module integration for the sensor; at the moment I am preparing the first draft of the differential pin mapping to the connector pins
16:14
se6astian
would be great if you could create the foundation for that please, as you are currently the only one with access to the PCIe card multi-connection setup, I assume
16:14
se6astian
great to have you here mustafa
16:16
mustafa_
I have got all the documentation from Bertl and I have managed to import the connector locations onto a board as well, and managed to use the development board footprint, so mechanical alignment between the MYIR module and the new development module should be fine
16:16
Bertl
se6astian: yeah, will try to take care of that once the LAM/INT project is finished
16:17
mustafa_
i am hoping to finish the sensor pin mapping to the module board and start adding the interface connectors
16:19
se6astian
for context: mustafa_ is designing a baseboard or carrier board for the MYIR UltraScale family with image sensor board connectors as well as high-speed plugin module connectors, to allow us to evaluate the Xilinx UltraScale as a next-gen platform with actual data input and output streams as we use them
16:20
Bertl
where MYiR is the FPGA board manufacturer and the Ultrascale comes from Xilinx/AMD
16:20
se6astian
yes http://www.myirtech.com/list.asp?id=612
16:20
Bertl
sidenote here: we are also looking into the KRIA platform and boards from Xilinx
16:21
BAndiT1983_
thanks, was just about to ask which base board is being used
16:22
Bertl
personally I also would call the board mustafa_ is designing an Evaluation Board, but Carrier Board is probably a good choice as well
16:24
se6astian
BAndiT1983_ do you also have some news for us?
16:24
Bertl
in any case, very interesting project and great that mustafa_ is working on it with such enthusiasm!
16:24
se6astian
vup/anuejn I see lots of progress from you here and on github, can you give us a brief summary as well?
16:26
BAndiT1983_
yep, a bit, trying to push some tool development forward, mostly raw12viewer, which is mostly fixed now, but I started to consider pyqt as an alternative to pysimplegui, as the former allows more sophisticated event handling and QGraphicsView is an awesome canvas full of features
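A minimal sketch of the kind of PyQt canvas mentioned above (a QGraphicsScene shown in a QGraphicsView); the file name and window title are placeholders, and this is not the actual raw12viewer code.

```python
# Minimal PyQt5 sketch: display a pixmap in a QGraphicsView canvas.
import sys
from PyQt5.QtGui import QPixmap
from PyQt5.QtWidgets import QApplication, QGraphicsScene, QGraphicsView

app = QApplication(sys.argv)

scene = QGraphicsScene()
scene.addPixmap(QPixmap("frame.png"))  # hypothetical decoded frame

view = QGraphicsView(scene)
view.setWindowTitle("raw12 preview (sketch)")
view.show()

sys.exit(app.exec_())
```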
16:26
BAndiT1983_
am using our PCB paneliser script as playground for pyqt too, for a WYSIWYG UI, is coming along nicely too, will show screenshots next week maybe, if nothing comes in-between again
16:27
BAndiT1983_
as last thing, don't remember if i've reported it some weeks ago or not, SAME70 USB was done and working fine, next target was the bootloader and self-flashing, but some private things had to be taken care of, so this one is hibernating slightly, but not forgotten and will be continued soon
16:28
BAndiT1983_
*regarding the size of pyqt vs pysimplegui: the former was smaller when packed with pyinstaller into a single file, and even the folder with all dependencies was much smaller too
16:28
BAndiT1983_
that's it for now
16:29
se6astian
many thanks BAndiT1983_!
16:29
vup
is here
16:29
se6astian
quick updates from my side: handed over the first colorchart to max last week and picked up more charts and high CRI LED lights today, to hand over to max soon so he can capture more snaps for ryan
16:29
se6astian
great vup, please go on!
16:30
vup
well mostly anuejn has been working on some color science stuff, I just played around with it a bit as well
16:31
vup
the basic idea is to take the spectral response of the sensor given in the datasheet to simulate images of a color checker
16:31
vup
(which obviously depends on the lighting / illuminant setup)
16:34
vup
one can then try to find a transformation that brings the simulated sensor response for the different color patches to the "expected" color values for the color patches
16:35
vup
we started off using a simple linear transformation (i.e. a 3x3 matrix) that minimizes the sum of quadratic distances between the resulting transformed sensor pixel values and the expected XYZ color values for the color patches
16:35
vup
in other words: a linear transformation that brings the raw camera pixel values into XYZ color space
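A minimal sketch of the linear fit described above, assuming the simulated patch responses and their reference XYZ values are available as NumPy arrays; the variable names and the random placeholder data are illustrative and not taken from the linked notebook.

```python
# Sketch of the 3x3 matrix fit: find M that maps simulated raw camera responses
# of the colour checker patches to their reference XYZ values (least squares).
import numpy as np

camera_rgb = np.random.rand(24, 3)   # simulated sensor responses, one row per patch (placeholder)
target_xyz = np.random.rand(24, 3)   # reference XYZ values for the same patches (placeholder)

# Solve camera_rgb @ M.T ~= target_xyz for M in the least-squares sense.
M_T, *_ = np.linalg.lstsq(camera_rgb, target_xyz, rcond=None)
M = M_T.T  # 3x3 colour matrix: xyz ~= M @ raw_rgb for a single pixel

corrected = camera_rgb @ M.T
print(np.abs(corrected - target_xyz).mean())
```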
16:36
vup
for two different illuminants one can see a comparison between the "correct" color and the color we can obtain from the sensor using such a linear transform here:
16:36
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A.pdf
16:36
vup
https://f.coroot.de/cmv12k_color_response/color_checker_D65.pdf
16:36
vup
now there are different directions one can go when trying to improve this
16:38
vup
for one, the Euclidean distance in XYZ color space doesn't really map well to perceptual difference, so instead of optimizing this in XYZ space one could choose for example Luv, so that hopefully the "color differences" that are optimized more closely match human perception
16:38
vup
this is done here for this comparison:
16:38
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A_luv.pdf
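A sketch of how the loss could be evaluated in CIE L*u*v* instead of XYZ; the D65 whitepoint and the hand-written conversion helper below are assumptions for illustration and may differ from what the notebook actually uses.

```python
# Score the fit in CIE L*u*v* so the squared error better tracks perception.
import numpy as np

D65_WHITE = np.array([0.95047, 1.0, 1.08883])  # XYZ of D65, Y normalised to 1 (assumed reference white)

def xyz_to_luv(xyz, white=D65_WHITE):
    X, Y, Z = np.moveaxis(np.asarray(xyz, dtype=float), -1, 0)
    Xn, Yn, Zn = white

    def uv_prime(x, y, z):
        d = x + 15 * y + 3 * z
        return 4 * x / d, 9 * y / d

    u, v = uv_prime(X, Y, Z)
    un, vn = uv_prime(Xn, Yn, Zn)

    y = Y / Yn
    L = np.where(y > (6 / 29) ** 3, 116 * np.cbrt(y) - 16, (29 / 3) ** 3 * y)
    return np.stack([L, 13 * L * (u - un), 13 * L * (v - vn)], axis=-1)

def luv_loss(predicted_xyz, target_xyz):
    # Sum of squared distances in L*u*v* instead of XYZ.
    return np.sum((xyz_to_luv(predicted_xyz) - xyz_to_luv(target_xyz)) ** 2)
```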
16:39
vup
furthermore one can look into transformations that are nonlinear
16:39
vup
especially ones that still preserve exposure independence
16:40
vup
using the one described in this paper: https://ieeexplore.ieee.org/document/7047834
16:40
vup
improves matters a bit more
16:41
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A_finlayson.pdf
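For reference, a rough sketch of a degree-2 root-polynomial colour correction in the spirit of the linked paper, as summarised in the discussion above: every term of the expansion scales linearly with exposure, which is what preserves exposure independence. This is an illustration, not the code behind the linked plot.

```python
# Degree-2 root-polynomial colour correction sketch.
import numpy as np

def root_poly_expand(rgb):
    # rgb: (N, 3) array of raw values; returns the (N, 6) degree-2 expansion.
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack(
        [r, g, b, np.sqrt(r * g), np.sqrt(g * b), np.sqrt(r * b)], axis=1
    )

def fit_root_poly(camera_rgb, target_xyz):
    # Least-squares fit of a 3x6 matrix from expanded terms to XYZ.
    expanded = root_poly_expand(camera_rgb)
    M_T, *_ = np.linalg.lstsq(expanded, target_xyz, rcond=None)
    return M_T.T

def apply_root_poly(M, rgb):
    return root_poly_expand(rgb) @ M.T

# Scaling the input by k scales the output by k as well (exposure independence):
# apply_root_poly(M, k * rgb) == k * apply_root_poly(M, rgb)
```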
16:42
vup
note that this was just some basic playing around, so there might still be some errors in these plots
16:43
vup
also, the more "complex" the transform the more parameters there are, so this could also just be overfitting already
16:44
vup
furthermore, it would be interesting to actually use these color transformations for a real image and check if they actually work for those
16:44
vup
anyways, that's it for now
16:46
se6astian
very nice progress!
16:46
se6astian
anyone else who wants to share/report anything?
16:50
se6astian
ok then, many thanks everyone, very happy to see so many things happening in parallel! way to go! MEETING CONCLUDED
16:59
Bertl
thanks!
17:58
vup
se6astian: does max have any lights / illuminants with known spectrum?
18:08
se6astian
yes, tungsten lights
18:25
vup
nice
18:26
vup
se6astian: any color checker images with those yet?
18:52
se6astian
not yet
19:13
anuejn
some additions from my side: I did develop the clock timelapse sequence with the derived transformations and DaVinci Resolve and they looked "correct" to me
19:14
anuejn
which is a major improvement compared to the current color matrix (stolen from a1ex's raw2dng) which does not look "correct"
19:15
anuejn
then vup and I worked some on the recorder last week so that we can have processing pipelines with caching and without eating up all your RAM
19:16
anuejn
this seems to work well and we will hopefully merge this soonish
19:24
se6astian
sounds great!
19:28
anuejn
Bertl, mustafa_: what made you choose the MYiR board over the Kria one for that prototype?
19:28
anuejn
is asking because he already owns the Kria board
19:29
anuejn
se6astian: can I publish that pdf to github or would you not like that?
19:29
se6astian
sure feel free to push/publish
19:30
se6astian
we want to evaluate both Kria and MYiR
19:30
anuejn
ah nice
19:30
se6astian
but IIRC the main concern with the Kria was the lack of IO for all sensor LVDS lanes
19:30
anuejn
I see
19:30
anuejn
If you have a spare Kria prototype board I would be happy to test it :D
19:33
se6astian
from oshpark we will get 3 copies anyways
19:33
se6astian
if you bring bertl a cake he might assemble 3 :)
19:33
se6astian
now that you can physically actually deliver cake :D
19:35
anuejn
:P
19:35
anuejn
for that board I would actually make some really good cake
19:35
anuejn
but I fear Bertl would need to fetch it ;)
19:39
se6astian
I am sure we can arrange something!
19:45
Bertl
anuejn: well let's hope the cake isn't a lie! :)
19:45
anuejn
let's hope the Kria board isn't a lie ;)
19:46
Bertl
and yes, the main reason to go with the MYiR first is that we should not have any trouble with full sensor board connections
19:46
Bertl
there might still be a chance to get 64+3 LVDS working on the Kria board, but it is not as simple
19:48
anuejn
i see
19:48
anuejn
that would indeed be a bit sad
19:48
anuejn
what does it depend on whether that is possible?
19:59
androuser10
joined the channel
20:00
androuser10
what kind of movie promotion have you come across that falls into a gray area (neither ethical nor unethical)?
20:06
anuejn
movie promotion?
20:09
androuser10
cast n crew talk about movie before movie release
20:10
androuser10
cast n crew talk about movie on tv before movie release
20:10
androuser10
?
20:13
androuser10
cast n crew talk about movie on tv before movie release to create interest
20:25
anuejn
hm... no i do not have any business with that
20:45
Bertl
anuejn: from the spec, we have 42 + 16 LVDS on the headers, which means we are missing 6+3 LVDS pairs for a full sensor connection
20:46
Bertl
now we have a number of other GPIOs which might be suitable for the task if used properly, but we have to investigate this first
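A quick sanity check of the pair count above, assuming a full sensor connection needs the 64 data + 3 control/clock LVDS pairs mentioned earlier in the log (19:46); the breakdown is an assumption based on that earlier message.

```python
# Sanity check of the LVDS pair arithmetic.
available = 42 + 16        # LVDS pairs on the headers       -> 58
needed = 64 + 3            # assumed full sensor connection  -> 67
print(needed - available)  # 9 pairs short, i.e. the "6+3" mentioned above
```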
20:51
se6astian
left the channel
20:52
vup
left the channel
20:52
anuejn
left the channel
20:55
bluez
left the channel
20:56
BAndiT1983_
left the channel
20:56
eppisai
left the channel
20:56
polyrhythm
left the channel
20:59
androuser10
left the channel
21:41
mustafa_ug_
joined the channel
21:44
mustafa_
changed nick to: mustafa__
21:45
mustafa_
joined the channel
21:46
mustafa_ug_
left the channel
21:46
mustafa__
left the channel
21:51
bluez
joined the channel
21:51
BAndiT1983
joined the channel
21:51
vup
joined the channel
21:51
eppisai
joined the channel
21:51
se6astian
joined the channel
21:51
anuejn
joined the channel
21:51
polyrhythm
joined the channel
22:29
anuejn
se6astian: do you have links for each of the color filters tested?
22:33
Bertl
anuejn: did some investigations regarding K26 SOM and it doesn't look good for full sensor connection (at least for the CMV12k)
22:34
Bertl
there are only 58 HP LVDS pairs and while the HD GPIOs can be used for LVDS (with external termination) they stop at 250M data rates
22:34
Bertl
and HD also does not provide any delay elements or buffers
22:35
Bertl
so the only option would be to use some gearwork between the K26 and the sensor
23:28
aombk2
joined the channel
23:29
aombk
left the channel