
#apertus IRC Channel Logs

2022/10/10

Timezone: UTC


01:23
intrac
left the channel
01:29
intrac
joined the channel
07:25
se6astian
cool, yes I will dig them out for you
07:27
se6astian
https://cloud.apertus.org/index.php/s/xZZDjmfwBEsinid
07:27
se6astian
morrigan still did not fix the ssl cert error unfortunately
07:28
se6astian
but if you press "I accept the risk..."
09:01
se6astian
ssl should be fixed
11:18
anuejn
se6astian: do you have that data also in machine-readable form?
11:18
anuejn
e.g. as csv?
11:18
anuejn
it is quite cumbersome to extract that from the graphs
11:25
Bertl
https://automeris.io/WebPlotDigitizer/
11:44
se6astian
I don't, sorry
11:47
vup
anuejn: interesting, why the cutoff at 700nm?
11:49
anuejn
vup: the cutoff is at 730nm because the data for the color checker reflectivity only goes up to that wavelength
11:49
vup
interesting
11:49
vup
that seems pretty low
14:53
vup
anuejn: se6astian: this is what you get if you use the matrix obtained this way to actually reproduce the color checker:
14:53
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A.pdf
14:53
vup
https://f.coroot.de/cmv12k_color_response/color_checker_D65.pdf
14:54
vup
(I think there is a small typo in the upstream notebook, I uploaded the one I used to generate these here: https://f.coroot.de/cmv12k_color_response/color_matrices.ipynb)
14:55
vup
it seems like a simple matrix-based calibration / correction will not really work / will be a bit unsatisfactory
15:56
anuejn
interesting
15:56
se6astian
this is generated purely from the image sensor response curve + uv/ir filter spectral curve?
16:00
vup
se6astian: purely from image sensor response curve (no filter spectral curve)
16:01
anuejn
though a hard cutoff at 380nm and 730nm is assumed
16:11
se6astian
not bad!
16:16
vup
(but a cutoff at these wavelengths is "standard" according to wikipedia)
16:19
se6astian
I picked up the IT8 chart plus FilmGrade™ WHITE LED Strip Lights from waveform lighting
16:20
se6astian
will forward to max soon so he can capture more samples
16:22
se6astian
https://store.waveformlighting.com/products/filmgrade-led-strip-lights-for-film-photography
17:00
se6astian
MEETING TIME, who is here?
17:00
Bertl
is here ...
17:02
se6astian
any news Bertl?
17:03
se6astian
do you want to briefly summarize the mustafa meeting?
17:03
Bertl
yes, I have some news to share (besides the meeting with mustafa_ :)
17:04
BAndiT1983_
is here
17:05
Bertl
the LAM/INT project is currently hitting the hot/final phase, and I've been spending some quality time to get everything working there
17:05
Bertl
among other things, this means that we now have successful full and cut-out image transfers via dual HDMI capture using Magewell PCIe cards
17:05
se6astian
great!
17:07
Bertl
the sequencer is currently operating in pure userspace, utilizing the generic userspace I/O (UIO) devices to receive interrupts
17:07
Bertl
it is a simple python implementation for now but should probably be moved to C or Rust in the future and also made a realtime process
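For readers unfamiliar with the UIO mechanism, a minimal Python sketch of such an interrupt loop could look like the following; the device node and the unmask-then-read pattern of the uio_pdrv_genirq driver are assumptions here, and this is not the actual sequencer code:

    import os
    import struct

    # Hypothetical device node; the real index depends on the device tree setup.
    UIO_DEV = "/dev/uio0"

    fd = os.open(UIO_DEV, os.O_RDWR)
    while True:
        # Unmask / re-enable the interrupt (required by uio_pdrv_genirq).
        os.write(fd, struct.pack("<I", 1))
        # read() blocks until the next interrupt and returns the total interrupt count.
        (count,) = struct.unpack("<I", os.read(fd, 4))
        print("got interrupt, total count:", count)
        # ... advance the sequencer state machine here ...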
17:07
Bertl
the capture solution is also python based and uses the Magewell SDK
17:08
Bertl
at least on the 4 port HDMI card we do not have any problems capturing the full bandwidth we can output on the Axiom Beta
17:09
Bertl
there has been some work regarding binning and sending two half-frames as well, but that isn't working as intended yet
17:09
Bertl
that said, I'm still hacking on it and I'm kind of optimistic that we will get that working this week
17:10
Bertl
now for the meeting with mustafa_, let's wait a little and see if he is around, maybe he wants to report himself?
17:11
se6astian
right
17:12
Bertl
that's it from my side for this week
17:13
se6astian
is it part of the LAM project to also document the solution and prepare the tools for github?
17:13
se6astian
or do "we" need to cover that?
17:13
Bertl
I'm pretty sure we need to cover that
17:13
se6astian
ok
17:14
mustafa_
Hi guys, I have started working on the new MYIR module integration for the sensor, I am preparing the first draft of the differential pin mapping to connector pins at the moment
17:14
se6astian
would be great if you can create the foundation for that please, as you are currently the only one with access to the PCIe card multi-connection setup I assume
17:14
se6astian
great to have you here mustafa
17:16
mustafa_
I have got all the documentation from Bertl and I have managed to import the connector locations onto a board as well; managed to use the development board footprint, so mechanical alignment between the MYIR module and the new development module should be fine
17:16
Bertl
se6astian: yeah, will try to take care of that once the LAM/INT project is finished
17:17
mustafa_
I am hoping to finish the sensor pin mapping to the module board and start adding the interface connectors
17:19
se6astian
for context: mustafa_ is designing a baseboard or carrier board for the MYIR UltraScale family with image sensor board connectors as well as high-speed plugin module connectors, to allow us to evaluate the Xilinx UltraScale as next gen platform with actual data input and output streams as we use them
17:20
Bertl
where MYiR is the FPGA board manufacturer and the Ultrascale comes from Xilinx/AMD
17:20
se6astian
yes http://www.myirtech.com/list.asp?id=612
17:20
Bertl
sidenote here: we are also looking into the KRIA platform and boards from Xilinx
17:21
BAndiT1983_
thanks, was just about to ask which base board is being used
17:22
Bertl
personally I also would call the board mustafa_ is designing an Evaluation Board, but Carrier Board is probably a good choice as well
17:24
se6astian
BAndiT1983_ do you also have some news for us?
17:24
Bertl
in any case, very interesting project and great that mustafa_ is working on it with such enthusiasm!
17:24
se6astian
vup/anuejn I see lots of progress from you here and on github, can you give us a brief summary as well?
17:26
BAndiT1983_
yep, a bit, trying to push some tool development forward, mostly raw12viewer, which is mostly fixed now, but started to consider maybe pyqt as alternative to pysimplegui, as the former one allows more sophisticated event handling and QGraphicsView is an awesome canvas full of features
17:26
BAndiT1983_
am using our PCB paneliser script as playground for pyqt too, for a WYSIWYG UI, is coming along nicely too, will show screenshots next week maybe, if nothing comes in-between again
17:27
BAndiT1983_
as last thing, don't remember if i've reported it some weeks ago or not, SAME70 USB was done and working fine, next target was the bootloader and self-flashing, but some private things had to be taken care of, so this one is hibernating slightly, but not forgotten and will be continued soon
17:28
BAndiT1983_
*regarding the size of pyqt vs pysimplegui, the former was smaller when packed with pyinstaller into a single file, and even the folder with all dependencies was much smaller too
17:28
BAndiT1983_
that's it for now
17:29
se6astian
many thanks BAndiT1983_!
17:29
vup
is here
17:29
se6astian
quick updates from my side: handed over first colorchart to max last week and picked up more charts and high CRI LED lights today to hand over to max soon so he can capture more snaps for ryan
17:29
se6astian
great vup, please go on!
17:30
vup
well mostly anuejn has been working on some color science stuff, I just played around with it a bit as well
17:31
vup
the basic idea is to take the spectral response of the sensor given in the datasheet to simulate images of a color checker
17:31
vup
(which obviously depends on the lighting / illuminant setup)
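As a rough sketch of that simulation step (variable names, placeholder data and the wavelength grid here are illustrative, not taken from the actual notebook):

    import numpy as np

    # Assumed per-nanometre data, resampled onto a common grid from 380 nm to 730 nm:
    #   sensor_qe:  (N, 3)  spectral response of the sensor's R/G/B channels (datasheet)
    #   illuminant: (N,)    spectral power distribution of the light source (e.g. A or D65)
    #   patches:    (24, N) reflectance spectra of the color checker patches
    wavelengths = np.arange(380, 731)
    N = wavelengths.size
    sensor_qe = np.ones((N, 3))    # placeholder data
    illuminant = np.ones(N)        # placeholder data
    patches = np.ones((24, N))     # placeholder data

    # Simulated raw sensor value per patch and channel: sum the product
    # reflectance * illuminant * channel response over wavelength.
    raw = patches @ (sensor_qe * illuminant[:, None])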
17:34
vup
one can then try to find a transformation that brings the simulated sensor response for the different color patches to the "expected" color values for the color patches
17:35
vup
we started off using a simple linear transformation (i.e. a 3x3 matrix) that minimizes the sum of squared distances between the resulting transformed sensor pixel values and the expected XYZ color values for the color patches
17:35
vup
in other words: a linear transformation that brings the raw camera pixel values into XYZ color space
17:36
vup
for two different illuminants one can see a comparison between the "correct" color and the color we can obtain from the sensor using such a linear transform here:
17:36
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A.pdf
17:36
vup
https://f.coroot.de/cmv12k_color_response/color_checker_D65.pdf
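A minimal numpy sketch of that least-squares fit, continuing the illustrative sketch above (with `xyz` standing for the reference XYZ values of the same patches under the same illuminant; placeholder data again):

    import numpy as np

    xyz = np.ones((24, 3))  # placeholder reference XYZ values per patch

    # Solve raw @ M ≈ xyz in the least-squares sense: M is the 3x3 matrix that maps
    # raw camera values into XYZ while minimizing the sum of squared distances.
    M, residuals, rank, sv = np.linalg.lstsq(raw, xyz, rcond=None)
    xyz_predicted = raw @ M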
17:36
vup
now there are different directions one can go when trying to improve this
17:38
vup
for one, the Euclidean distance in XYZ color space doesn't really map well to perceptual difference, so instead of optimizing this in XYZ space one could choose for example Luv, so that hopefully the "color differences" that are optimized more closely match human perception
17:38
vup
this is done here for this comparison:
17:38
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A_luv.pdf
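A sketch of how the same fit could be refined with a Luv-space loss instead (assuming the colour-science and scipy packages are available; `raw`, `xyz` and the linear solution `M` are from the sketches above, and this is one possible formulation rather than the exact one used):

    import numpy as np
    from scipy.optimize import minimize
    import colour  # colour-science package, assumed available

    def luv_loss(m_flat):
        M_cand = m_flat.reshape(3, 3)
        # Clip negative predictions before converting XYZ -> Luv.
        xyz_pred = np.clip(raw @ M_cand, 0, None)
        luv_pred = colour.XYZ_to_Luv(xyz_pred)
        luv_ref = colour.XYZ_to_Luv(xyz)
        return np.sum((luv_pred - luv_ref) ** 2)

    # Start from the linear XYZ-space solution and refine with the perceptual-ish loss.
    result = minimize(luv_loss, M.ravel(), method="Nelder-Mead")
    M_luv = result.x.reshape(3, 3)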
17:39
vup
furthermore one can look into transformations that are nonlinear
17:39
vup
especially ones that still preserve exposure independence
17:40
vup
using the one described in this paper: https://ieeexplore.ieee.org/document/7047834
17:40
vup
improves matters a bit more
17:41
vup
https://f.coroot.de/cmv12k_color_response/color_checker_A_finlayson.pdf
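For reference, the degree-2 root-polynomial expansion from that paper can be sketched like this (whether the notebook uses exactly this degree and term set is an assumption; each term scales linearly with exposure, which is what preserves exposure independence):

    import numpy as np

    def root_poly_expand(rgb):
        # Degree-2 root-polynomial terms: R, G, B, sqrt(RG), sqrt(GB), sqrt(RB).
        # Scaling rgb by an exposure factor k scales every term by k as well.
        r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
        return np.stack([r, g, b,
                         np.sqrt(r * g), np.sqrt(g * b), np.sqrt(r * b)], axis=1)

    # Fit a 6x3 matrix from the expanded terms to XYZ, again via least squares.
    A = root_poly_expand(raw)
    M_rp, *_ = np.linalg.lstsq(A, xyz, rcond=None)
    xyz_pred_rp = root_poly_expand(raw) @ M_rp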
17:42
vup
note that this was just some basic playing around, so there might still be some errors in these plots
17:43
vup
also, the more "complex" the transform the more parameters there are, so this could also just be overfitting already
17:44
vup
furthermore, it would be interesting to actually use these color transformations for a real image and check if they actually work for those
17:44
vup
anyways, that's it for now
17:46
se6astian
very nice progress!
17:46
se6astian
anyone else who wants to share/report anything?
17:50
se6astian
ok then, many thanks everyone, very happy to see so many things happening in parallel! way to go! MEETING CONCLUDED
17:59
Bertl
thanks!
18:58
vup
se6astian: does max have any lights / illuminants with known spectrum?
19:08
se6astian
yes, tungsten lights
19:25
vup
nice
19:26
vup
se6astian: any color checker images with those yet?
19:52
se6astian
not yet
20:13
anuejn
some additions from my side: I did develop the clock timelapse sequence with the derived transformations and DaVinci Resolve and they looked "correct" to me
20:14
anuejn
which is a major improvement compared to the current color matrix (stolen from a1ex's raw2dng) which does not look "correct"
20:15
anuejn
then vup and I worked a bit on the recorder last week so that we can have processing pipelines with caching and without eating up all your RAM
20:16
anuejn
this seems to work well and we will hopefully merge this soonish
20:24
se6astian
sounds great!
20:28
anuejn
Bertl, mustafa_: what made you choose the MYIR board over the Kria one for that prototype?
20:28
anuejn
is asking because he already owns the Kria board
20:29
anuejn
se6astian: can I publish that pdf to github or would you not like that?
20:29
se6astian
sure feel free to push/publish
20:30
se6astian
we want to evaluate both Kria and MYIR
20:30
anuejn
ah nice
20:30
se6astian
but IIRC the main concern with the Kria was the lack of IO for all sensor LVDS lanes
20:30
anuejn
I see
20:30
anuejn
If you have a spare Kria prototype board I would be happy to test it :D
20:33
se6astian
from oshpark we will get 3 copies anyways
20:33
se6astian
if you bring bertl a cake he might assemble 3 :)
20:33
se6astian
now that you can physically actually deliver cake :D
20:35
anuejn
:P
20:35
anuejn
for that board I would actually make some really good cake
20:35
anuejn
but I fear Bertl would need to fetch it ;)
20:39
se6astian
I am sure we can arrange something!
20:45
Bertl
anuejn: well let's hope the cake isn't a lie! :)
20:45
anuejn
let's hope the Kria board isn't a lie ;)
20:46
Bertl
and yes, the main reason to go with the MYiR first is that we should not have any trouble with full sensor board connections
20:46
Bertl
there might still be a chance to get 64+3 LVDS working on the Kria board, but it is not as simple
20:48
anuejn
i see
20:48
anuejn
that would indeed be a bit sad
20:48
anuejn
what determines whether that is possible?
20:59
androuser10
joined the channel
21:00
androuser10
what kind of movie promotion have you come across that falls in a gray area (neither clearly ethical nor unethical)?
21:06
anuejn
movie promotion?
21:09
androuser10
cast n crew talk about movie before movie release
21:10
androuser10
cast n crew talk about movie on tv before movie release
21:10
androuser10
?
21:13
androuser10
cast n crew talk about movie on tv before movie release to create interest
21:25
anuejn
hm... no, I do not have any business with that
21:45
Bertl
anuejn: from the spec, we have 42 + 16 LVDS on the headers, which means we are missing 6+3 LVDS pairs for a full sensor connection
21:46
Bertl
now we have a number of other GPIOs which might be suitable for the task if used properly, but we have to investigate this first
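The pair count above adds up as follows, taking the 64+3 figure from earlier in the log as a full CMV12000 connection:

    needed = 64 + 3             # 67 LVDS pairs for a full sensor connection (64 data + 3 control)
    on_headers = 42 + 16        # 58 LVDS pairs available on the headers per the spec
    print(needed - on_headers)  # 9 pairs short, i.e. the missing 6+3 mentioned above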
21:51
se6astian
left the channel
21:52
vup
left the channel
21:52
anuejn
left the channel
21:55
bluez
left the channel
21:56
BAndiT1983_
left the channel
21:56
eppisai
left the channel
21:56
polyrhythm
left the channel
21:59
androuser10
left the channel
22:41
mustafa_ug_
joined the channel
22:44
mustafa_
changed nick to: mustafa__
22:45
mustafa_
joined the channel
22:46
mustafa_ug_
left the channel
22:46
mustafa__
left the channel
22:51
bluez
joined the channel
22:51
BAndiT1983
joined the channel
22:51
vup
joined the channel
22:51
eppisai
joined the channel
22:51
se6astian
joined the channel
22:51
anuejn
joined the channel
22:51
polyrhythm
joined the channel
23:29
anuejn
se6astian: do you have links for each of the color filters tested?
23:33
Bertl
anuejn: did some investigations regarding K26 SOM and it doesn't look good for full sensor connection (at least for the CMV12k)
23:34
Bertl
there are only 58 HP LVDS pairs and while the HD GPIOs can be used for LVDS (with external termination) they stop at 250M data rates
23:34
Bertl
and HD also does not provide any delay elements or buffers
23:35
Bertl
so the only option would be to use some gearwork between the K26 and the sensor
00:28
aombk2
joined the channel
00:29
aombk
left the channel