22:00 | Bertl | yep
| |
22:00 | Sasha_C | yeah, I'm also aware of this
| |
22:00 | Bertl | maybe we should have some kind of 'active axiom alpha development' link on each of those pages?
| |
22:00 | Sasha_C | I think that'd be a pretty good idea
| |
22:00 | Bertl | (something flashing or highlighted :)
| |
22:01 | Sasha_C | But who would maintain this?
| |
22:01 | Bertl | and we should make a central page which has that info
| |
22:01 | Sasha_C | yeah, definitely
| |
22:01 | Bertl | I guess a single page is easy to maintain for sebastian, and who knows, maybe you or philippe wants to write a section there too every now and then
| |
22:02 | Sasha_C | It's like with Gimp. I wish there was a page I could look at on their site that tracked the development progress they're currently making
| |
22:02 | Sasha_C | ok, that could work
| |
22:02 | Bertl | doesn't have to contain much, shouldn't be longer than a few pages (3-4 at most)
| |
22:03 | Bertl | can link to all the nice articles and milestones we already reached
| |
22:03 | Bertl | but it should definitely be updated on a weekly basis (maybe sometimes twice a week if something interesting happens)
| |
22:05 | Bertl | I think that kind of PR is essential for the upcoming crowd funding
| |
22:06 | Bertl | back to the noise/artefacts, can you try to explain to me what you are seeing in e.g. http://vserver.13thfloor.at/Stuff/AXIOM/RAW/colors_500ms.png ?
| |
22:08 | Sasha_C | Ok, when I view the image at full res, and I'm looking at the yellow band (as an example)... I can see a subtle pattern of vertical lines
| |
22:08 | Bertl | I also see vertical and horizontal lines
| |
22:09 | Sasha_C | Yes, forgive me if this isn't sensor pattern noise. I'm not sure what to call it
| |
22:09 | Bertl | for me, actually the horizontal lines are more visible
| |
22:09 | Bertl | it is a pattern which is too aligned to the sensor orientation to be present in the original motif :)
| |
22:10 | Sasha_C | yes, same here, it's just that my IRC window is blocking most of my web browser as I type this, and the vertical lines are easier for me to see
| |
22:10 | Sasha_C | yes, exactly
| |
22:10 | Bertl | but it might be the result of the debayering, we should look at the raw grey images instead
| |
22:11 | Bertl | I also see the 'hot' pixels, which are most likely related to the different discharge times I suspect on a per pixel basis
| |
22:12 | Sasha_C | I wasn't sure if they were hot or dead pixels on the defective sensor
| |
22:12 | Sasha_C | hey, I'm sorry to be rude, but I have to go have a shower now and then head off to work (starting a little later today...)
| |
22:13 | Sasha_C | I'll email Sebastian re: creating a development section on site, updated once to twice per week
| |
22:13 | Bertl | no need to be sorry, we'll talk later ...
| |
22:13 | Sasha_C | ok, thanks. Have a great one!
| |
22:13 | Bertl | thanks, appreciated! you too!
| |
22:40 | Sasha_C | left the channel | |
23:14 | troy_s | Greets all.
| |
23:15 | troy_s | FergusL: Greets. About to shower then I have a few...
| |
23:16 | FergusL | good, troy_s
| |
23:16 | FergusL | evening Bertl
| |
23:16 | FergusL | shall I read the whole backlog ?
| |
23:16 | troy_s | If someone can pastebin I will read when done.
| |
23:17 | Bertl | didn't happen much I'd say
| |
23:19 | FergusL | troy_s: Bertl and I can easily sum up
| |
23:19 | FergusL | basically last night we were looking at the first picture taken by Bertl with the "experimental" setup he has assembled
| |
23:20 | FergusL | and quickly we came to matters which I discussed with you : linearization, corrections made to the image
| |
23:21 | FergusL | heading to the kitchen for a bit
| |
23:32 | FergusL | Bertl: did you continue to investigate about yesterday's issue ?
| |
23:33 | Bertl | which one?
| |
23:35 | FergusL | the fixed stuck pixels
| |
23:41 | troy_s | FergusL: Back
| |
23:41 | FergusL | wb
| |
23:41 | FergusL | and thanks
| |
23:42 | troy_s | FergusL: What in particular regarding linearization?
| |
23:43 | Bertl | well, I suspect the pixels (let's call them 'hot pixels' for now)
| |
23:43 | troy_s | (and have the primaries been mapped for the sensor yet?)
| |
23:43 | troy_s | Bertl: Hrm... I take it this is post debayering?
| |
23:44 | troy_s | Bertl: Or pre?
| |
23:44 | Bertl | FergusL: are the result of an unusually long exposure time and the fact that pixels discharge at different rates
| |
23:44 | Bertl | troy_s: we have pre and post debayering data
| |
23:44 | troy_s | Bertl: Correct. The greens fill first.
| |
23:44 | Bertl | the .raw8/.raw16 are pre debayering
| |
23:45 | troy_s | Bertl: Have you attempted to map the native primaries?
| |
23:45 | Bertl | please define what 'native primaries' means
| |
23:45 | troy_s | Bertl: The native 'colors' of the R G and B channels.
| |
23:46 | troy_s | Are you familiar with the CIE research?
| |
23:46 | Bertl | nope, not at all
| |
23:46 | troy_s | Oh. Good time to learn a tad about it, but I can give you a condensed notes version if you like.
| |
23:46 | FergusL | (say "yes")
| |
23:46 | Bertl | yes, that would be appreciated
| |
23:47 | Bertl | FergusL: I always say yes to information :)
| |
23:47 | troy_s | In 1931 two scientists (Guild and Wright IIRC) concurrently were trying to map the range of average human vision.
| |
23:48 | troy_s | There isn't much data on exactly the age or gender of their samples (likely male and younger), but they developed correlating data sets that matched almost perfectly.
| |
23:48 | troy_s | With me so far?
| |
23:48 | Bertl | yup
| |
23:49 | troy_s | Color is much more than wavelengths of light... it is known as a psychophysical phenomenon - it doesn't exist in the real world but is rather how our sensors interpret the data.
| |
23:49 | troy_s | It is entirely relative to context and many other phenomena (hence the psycho in the psychophysical term)
| |
23:49 | Bertl | not just the sensors, at some point the brain too
| |
23:50 | troy_s | Exactly.
| |
23:50 | troy_s | The researchers collected data using a very simple test. They took the most saturated versions of tri color light in red, green, and blue and presented swatches in a 2 degree circle to observers
| |
23:51 | Bertl | do you have an url to illustrate this?
| |
23:52 | troy_s | Strictly controlled such that the observers could dial in the three lights to match the swatches. The three primaries - or colors of the lights - were carefully selected such that equal parts added to an achromatic color (white if you will, not that white exists per se)
| |
23:52 | troy_s | With me?
| |
23:52 | Bertl | somewhere on the web, so that I get an idea how they looked like
| |
23:52 | troy_s | There isn't a great description of the experiment.
| |
23:52 | Bertl | okay, no problem
| |
23:53 | troy_s | Only loosely. And I had to actually email Mark Fairchild (_the_ color god) to get some explanation as to what I am about to explain.
| |
23:53 | Bertl | so basically they had to figure out the R,G and B values for a saturated color
| |
23:54 | troy_s | So if you visualize the data as a plot, we could have a triangle with our red, green, and blue primaries at the points. The observers matched the swatches.
| |
23:54 | troy_s | The problem was that there were a good number of colors that the observers could not dial in with the lights - they were beyond the 'gamut' that the three lights could express.
| |
23:54 | Bertl | triangle in what color space?
| |
23:54 | troy_s | (bear with me :))
| |
23:54 | Bertl | okay
| |
23:55 | troy_s | Does this make sense so far?
| |
23:55 | Bertl | to some extent, but I think I know where this is heading
| |
23:55 | troy_s | So to make the swatches matchable, the researchers added quantities of the primaries to bring them into gamut, making them less saturated.
| |
23:56 | troy_s | Once a match was made, they were able to calculate (because all natural light operates radiometrically linear) where their "original" positions on the chart were.
| |
23:56 | Bertl | reading up on sRGB, this is what you are referring to, no?
| |
23:56 | troy_s | This chart couldn't be named the CIE RGB, as it went beyond the RGB primaries they used.
| |
23:56 | troy_s | It became known as the 1931 2° XYZ color space.
| |
23:57 | Bertl | http://en.wikipedia.org/wiki/Chromaticity_diagram
| |
23:57 | troy_s | And if you see most horseshoe charts out there, that is a derivative of it known as the xyY chart.
| |
23:57 | troy_s | Exactly!
| |
23:57 | troy_s | That is the CIE xyY chart
| |
23:58 | troy_s | Which, because XYZ is infinitely large (with intensity), is merely a scaled version of XYZ.
| |
23:58 | Bertl | okay, understood, so the primaries are 3 points in that space now, yes?
| |
23:58 | troy_s | That long rambling description is because THAT colorspace, the XYZ and the xyY model, is used to describe all color.
| |
23:58 | troy_s | Exactly.
| |
23:59 | troy_s | So when someone asks "what are the camera's native primaries?", they mean (generally) the xy coordinates in the xyY chart
| |
23:59 | Bertl | and how would I go about mapping them?
| |
23:59 | troy_s | The unique characteristics of the xyY chart is that the Y represents pure luminance irrespective of color.
| |
00:00 | troy_s | To "describe" a color, a simple xy pair can work.
| |
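troy_s's description above can be sketched in a few lines: the xy pair is just the XYZ tristimulus values normalized by their sum. (The D65 white point in the example is a standard reference value, not anything measured from this sensor.)

```python
def xyz_to_xy(X, Y, Z):
    """Project CIE XYZ tristimulus values onto the xy chromaticity plane.

    Y carries pure luminance; the x, y pair alone describes chromaticity,
    which is why camera primaries are usually quoted as xy coordinates.
    """
    s = X + Y + Z
    if s == 0:
        return 0.0, 0.0  # black: chromaticity is undefined, return origin
    return X / s, Y / s

# Example: the D65 white point, (X, Y, Z) ~ (0.95047, 1.0, 1.08883),
# lands at roughly (0.3127, 0.3290) on the familiar horseshoe chart
x, y = xyz_to_xy(0.95047, 1.0, 1.08883)
```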
00:00 | troy_s | You would shoot a controlled chart
| |
00:00 | troy_s | Of known photographic densities
| |
00:01 | troy_s | and compare the results to the known values
| |
00:01 | Bertl | I see, well, we will hopefully get some charts in the near future
| |
00:01 | troy_s | ('profiling' the camera)
| |
00:01 | troy_s | Is the data off the sensor curved? or linear?
| |
00:01 | Bertl | (not sure those charts are the ones you are referring to, though)
| |
00:02 | Bertl | I'd also need a reference (gray?) chart to say that, no?
| |
00:02 | troy_s | (you can use many charts to get a ballpark idea - IT8 or even a Gretag Macbeth chart can give you a rough idea)
| |
00:02 | troy_s | A greyscale chart (included on an IT8 IIRC) will show you the TRC
| |
00:02 | troy_s | (tone response curve) if it is baked into the data.
| |
00:03 | troy_s | Sensors are linear creatures, but sometimes the makers bake a log type curve into the data.
| |
00:03 | troy_s | To make it more agreeable to LDR display.
| |
00:03 | Bertl | but that should be mentioned in the datasheet, no?
| |
00:04 | troy_s | Is your debayered image darkish (linear) or look sort of normal?
| |
00:04 | troy_s | SHOULD be estimates. Every sensor would deviate more or less depending on quality.
| |
00:04 | FergusL | it looked normal to me
| |
00:05 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/colors_500ms.png
| |
00:05 | troy_s | FergusL: Direct debayered?
| |
00:05 | troy_s | How are you debayering?
| |
00:05 | FergusL | after being debayered, "yes"
| |
00:05 | troy_s | What algorithm?
| |
00:05 | Bertl | it is called 'SIMPLE'
| |
00:05 | troy_s | Because the debayer is likely applying a TRC to the data.
| |
00:06 | Bertl | okay, give me a few seconds to eliminate the debayering
| |
00:06 | Bertl | by just mapping the channels manually
| |
00:06 | troy_s | Would be interesting to see... hard to know from that reference shot.
| |
00:07 | FergusL | "applying a TRC", like "bending" it ?
| |
00:07 | troy_s | Yes.
| |
00:07 | troy_s | Exactly.
| |
00:08 | FergusL | ok
| |
00:08 | troy_s | Of course the bayered image is nothing more than a greyscale, but if you treat those values as raw rgb, the image is probably a linear one. Or it is encoded into a YCbCr signal etc.
| |
00:09 | troy_s | What would be _very_ interesting is a shot of an IT8
| |
00:09 | troy_s | I could sponsor one for the project probably. Wolf makes decent ones.
| |
00:10 | troy_s | http://www.targets.coloraid.de
| |
00:10 | FergusL | Ha, that could be great, we were considering ordering one in some way and sending it to Bertl
| |
00:10 | troy_s | Easier to just get one shipped directly.
| |
00:11 | FergusL | woh
| |
00:11 | FergusL | much more complete than a Macbeth
| |
00:11 | troy_s | C1
| |
00:11 | troy_s | A Macbeth is not ideal for profiling
| |
00:12 | troy_s | as there are only 24 swatches
| |
00:12 | troy_s | and no greyscale
| |
00:12 | troy_s | IT8s are a reference standard. There is another with many more swatches.
| |
00:13 | FergusL | there's something I'm wondering, does it come with very specific settings instructions ? like lighting
| |
00:13 | troy_s | Some known white point film lighting gear like a Kino would be needed as the white point must be known (and checked if possible using a color meter)
| |
00:14 | Bertl | the Macbeth said that lighting doesn't matter (not sure that is true :)
| |
00:14 | troy_s | Expose middle grey
| |
00:14 | Bertl | uploading composed image now
| |
00:14 | troy_s | it does.
| |
00:14 | troy_s | largely because to compute the matrix (most sensors are uniform and fall apart near the high and low end)
| |
00:14 | troy_s | (spill and noise respectively)
| |
00:15 | troy_s | you _must_ know the color of the illuminant.
| |
00:15 | troy_s | but from that test, you can get a reasonably accurate estimate of the raw data gamut and native white point.
| |
00:15 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/colors_500ms_composed.png
| |
00:15 | FergusL | we do have Kinos, at least sebastian has access to some
| |
00:16 | Bertl | this is after separating the channels, throwing away one green channel and combining them to RGB
| |
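As a rough sketch of what Bertl just described (assuming an RGGB mosaic order, which would need to be checked against the CMV12000 datasheet):

```python
def bayer_to_rgb(raw):
    """Naive split of an RGGB mosaic into quarter-resolution RGB triples.

    No interpolation; one of the two green sensels is simply discarded,
    as Bertl described. The RGGB layout is an assumption, not confirmed
    from the datasheet.
    """
    h, w = len(raw), len(raw[0])
    rgb = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = raw[y][x]
            g = raw[y][x + 1]        # first green; raw[y+1][x] is the second
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# 4x4 synthetic mosaic -> 2x2 grid of (R, G, B) triples
raw = [[0, 1, 2, 3],
       [4, 5, 6, 7],
       [8, 9, 10, 11],
       [12, 13, 14, 15]]
rgb = bayer_to_rgb(raw)
```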
00:16 | troy_s | Hrm. It looks like it has a TRC applied?!?
| |
00:16 | troy_s | Some processor level TRC
| |
00:16 | FergusL | hmpf
| |
00:17 | troy_s | Odd as hell.
| |
00:17 | troy_s | Bertl: Is there a way to pull lower level data?
| |
00:18 | troy_s | (I am wondering why and how they would apply a TRC to the bayer sensel data.)
| |
00:18 | troy_s | I would also like to know what TRC it is. Are there docs for the sensor? Does it mention log anywhere?
| |
00:18 | FergusL | well, in a way the processor is kind of "ours"
| |
00:18 | Bertl | no, that is from the basic data directly from the ADCs
| |
00:18 | troy_s | (although it sure doesn't look like a log curve)
| |
00:19 | FergusL | did you use some algorithm somewhere else in the pipe ?
| |
00:19 | Bertl | so whatever curve or transformation happens, it should be the ADC profile
| |
00:19 | FergusL | like the "SIMPLE" one
| |
00:19 | Bertl | nope, removed
| |
00:19 | FergusL | that's going to have to be added to the "errata questions" list I believe :)
| |
00:19 | FergusL | Bertl: about the sensor doc, do you have a link to it handy
| |
00:19 | FergusL | ?
| |
00:23 | troy_s | Sorry. Hotel wifi punted me.
| |
00:24 | FergusL | looking for the full specs of the sensor
| |
00:24 | FergusL | hold on
| |
00:25 | troy_s | I suspect then that the ADCs are putting a TRC onto the data for whatever reason.
| |
00:25 | troy_s | Still needs to be accurately mapped so that you can invert it (yuck) to get to a scene referred image (or close to it)
| |
00:26 | troy_s | Very exciting to see data from the sensor however.
| |
00:26 | FergusL | yes
| |
00:26 | FergusL | but agreeing on "yuck"
| |
00:26 | FergusL | I still have hope for something else going on
| |
00:27 | Bertl | sec, it should be on github
| |
00:27 | troy_s | FergusL: It is entirely possible there is a register to poke to get the raw data.
| |
00:28 | Bertl | https://github.com/apertus-open-source-cinema/alpha-hardware/blob/master/Datasheets/datasheet_CMV12000%20v8.pdf
| |
00:28 | troy_s | FergusL: My worry is that (and yes it has happened) some sensor manufacturer out of China or wherever is applying a transform to yield colors they like or think looks "right"
| |
00:28 | FergusL | absolutely, yes
| |
00:29 | troy_s | FergusL: And to prove my depressing point... read this...
| |
00:30 | troy_s | From the Cinematography mailing list http://pastebin.com/8kr06ckr
| |
00:32 | Bertl | but I guess it should be trivial to check for linearity by taking samples with linear exposure times, no?
| |
00:32 | FergusL | oh god, when was that on CML ?
| |
00:32 | Bertl | linearly increasing exposure times, that is
| |
00:34 | FergusL | hm...
| |
00:34 | FergusL | what would that give ?
| |
00:34 | FergusL | higher values ?
| |
00:34 | FergusL | but bent in the same way
| |
00:34 | FergusL | (my take on it)
| |
00:34 | Bertl | what I mean is, I put up a white piece of paper
| |
00:35 | Bertl | and configure exposure times from e.g. 5ms to 500ms in 5 ms intervals
| |
00:35 | Bertl | then I make a histogram of each sample (from the center area to remove the pixel defects)
| |
00:36 | Bertl | and then I plot the centers of the expected bell curves in a linear plot
| |
00:37 | troy_s | LOL. "Spectral response: Not available yet."
| |
00:37 | Bertl | that should show the response curve of the pixels
| |
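Bertl's proposed test boils down to checking that the mean sample value grows proportionally with exposure time. A minimal sketch with synthetic data (the helper name and numbers are made up for illustration; real input would be histogram means from a defect-free center crop):

```python
def linearity_check(exposures_ms, mean_values):
    """If the sensor is linear (and not clipping), mean value divided by
    exposure time should be a constant factor. Returns the per-sample
    factors: a flat sequence suggests linearity, a drifting one a curve.
    """
    return [v / t for t, v in zip(exposures_ms, mean_values)]

exposures = [5, 50, 250, 500]

# Synthetic linear sensor: value = 2.0 * exposure -> constant factor 2.0
factors = linearity_check(exposures, [2.0 * t for t in exposures])

# Synthetic gamma-encoded sensor: value = exposure ** 0.45
# -> factors drift downward, i.e. a TRC is baked in
curved_factors = linearity_check(exposures, [t ** 0.45 for t in exposures])
```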
00:37 | FergusL | I'm not sure if I understand right
| |
00:37 | FergusL | well in fact I think I see
| |
00:38 | troy_s | Bertl: You can probably get a good idea of the curve if you have a series of reflective swatches at half stop increments or quarter stops.
| |
00:38 | Bertl | well, I don't have any swatches atm :)
| |
00:38 | troy_s | It just doesn't look linear (not knowing what the test tape looked like)
| |
00:38 | troy_s | Light meter?
| |
00:38 | Bertl | nope
| |
00:38 | troy_s | Balls.
| |
00:39 | troy_s | LOL
| |
00:39 | Bertl | yes, I juggle :)
| |
00:39 | troy_s | And we don't really know the primaries, so we are seeing an sRGB dump of the RGB channels.
| |
00:39 | troy_s | Did you shoot the pattern?
| |
00:40 | Bertl | I do not shoot, not even pattern :) but yes, I captured the image
| |
00:40 | Bertl | so, any flaws you can find in my reasoning regarding detecting any non linear transfer function?
| |
00:41 | troy_s | The bottom line is that to make the data useful, this will need to be tackled soon. Largely also because the target audience will also find said information very useful.
| |
00:41 | troy_s | Hrm.
| |
00:41 | Bertl | (by linearly increasing the exposure time and capturing a histogram?)
| |
00:42 | troy_s | I am unsure.
| |
00:43 | troy_s | Because we are talking about the response curve in a single image. If we do exposure slices I fear that we will always be grabbing the "middle" value. But I have never tried anything like that so I am simply uncertain.
| |
00:43 | FergusL | what troy_s just said about the "target audience" is very important, I think we're atm discussing points that will have to be put forward in our documentation
| |
00:43 | troy_s | Try it perhaps? If the result is a curve, then the test is perhaps working.
| |
00:43 | Bertl | hehe :)
| |
00:43 | troy_s | The gamut of that sensor is pretty darn important for sure. But the curve is deadly important.
| |
00:43 | Bertl | well, that would be selecting the test to support your argument :)
| |
00:44 | Bertl | i.e. biasing the information ... but I do not see any flaw in the reasoning
| |
00:44 | troy_s | If they are baking some nonstandard curve in merely to present the data in a "nice" fashion, it makes the data useless if we can't reverse it.
| |
00:45 | troy_s | To store as a native DPX on a storage device or even a purely linear EXR for example.
| |
00:45 | Bertl | i.e. if there is a linear relation between light and read out values, then we should get a linear relation between the sampled/mean values for linear exposure steps, no?
| |
00:45 | troy_s | Bertl: Well if it is linear and the image clearly isn't, the test failed. If it is curved, then it likely is working to a degree. Not much room for bias I can see?
| |
00:45 | Bertl | and if it isn't linear, then we should see the curve
| |
00:46 | Bertl | the bias is, that you say it failed if it doesn't support your assumption, and it might have worked if it does
| |
00:46 | Bertl | that is like when I say, let's plot this data to see if it is linear
| |
00:46 | troy_s | And given that you don't have the means to do a proper chart test (or toggle the register to get raw linear) then I guess it is better than nothing.
| |
00:46 | Bertl | and you say, well, if it fits a curve, your plot is wrong
| |
00:47 | troy_s | Well not quite
| |
00:47 | troy_s | a linear image is deadly easy to see
| |
00:47 | troy_s | and it almost certainly doesn't feel close to a linear image being displayed on an sRGB close device.
| |
00:47 | troy_s | (but a face would be easier perhaps)
| |
00:47 | Bertl | I don't see how you could see a linear image from single colors
| |
00:48 | troy_s | A linear representation in data looks wholly "crunched" down
| |
00:48 | Bertl | there are no gray values in this image, except from the change of light (which falls from one side)
| |
00:48 | troy_s | Very "dark" to use a low level guess.
| |
00:48 | troy_s | the bright region (and notably yellow) would likely be much more crunched.
| |
00:49 | FergusL | very dark with only the highly exposed areas of the image shining through
| |
00:49 | troy_s | Exactly
| |
00:49 | FergusL | that's roughly what a "linear image" looks like, even though that doesn't make much sense
| |
00:49 | Bertl | I think this assumption might hold on a real world scene, but I do not see how it applies to this artificial image
| |
00:49 | troy_s | FergusL: Can you quickly invert an srgb curve on that image and post it?
| |
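For reference, "inverting an sRGB curve" means applying the standard sRGB decode to each channel. A sketch (this assumes the image really carries the standard piecewise sRGB curve, which is exactly what is in question here):

```python
def srgb_to_linear(v):
    """Undo the standard sRGB encoding for a channel value in [0, 1].

    If the sensor data had an sRGB-like TRC baked in, applying this
    would bring it back toward scene-linear values.
    """
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Mid grey in sRGB encoding (~0.5) maps to ~0.214 linear --
# the "crunched down", dark look of linear data mentioned above
mid = srgb_to_linear(0.5)
```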
00:50 | troy_s | Bertl: It is tape on a wall correct?
| |
00:50 | FergusL | let me do this
| |
00:50 | Bertl | tape on a white paper (A4)
| |
00:51 | troy_s | Bertl: I have looked at a few images in my time, and that is how I can roughly guess what it likely looks like in the physical scene.
| |
00:52 | troy_s | I _may_ very well be incorrect, but the telltale falloff on the yellow tape
| |
00:52 | troy_s | from bottom to top
| |
00:52 | troy_s | suggests a gradient that is almost certainly not a linear representation dumped to an sRGB display curve.
| |
00:52 | troy_s | FergusL: Agree?
| |
00:53 | FergusL | hm, agreed yes
| |
00:54 | Bertl | so why not figure how we can 'measure' it with what we(I) currently have available?
| |
00:54 | troy_s | Bertl: Because I am not as clever as you?
| |
00:54 | troy_s | If you place a light source close
| |
00:54 | troy_s | and measure it
| |
00:55 | troy_s | the measurement of X distance (near left edge perhaps) should be 1/4 the rgb values at 2x
| |
00:55 | troy_s | 1/d^2 of course
| |
00:55 | troy_s | (assuming a simple pinpoint tungsten fixture, that SHOULD hold roughly true. EG No reflector or diffuse source)
| |
00:56 | troy_s | so given light from lamp to left edge as X
| |
00:56 | Bertl | okay, so I get a small lamp which can be considered a point source
| |
00:56 | troy_s | lamp to right edge as a ratio of x
| |
00:56 | troy_s | the rgb values, if linear, should hold according to 1/d^2
| |
00:56 | Bertl | put it nearby the paper and take a picture, yes?
| |
00:56 | troy_s | Does that make sense?
| |
00:56 | troy_s | Yes
| |
00:57 | troy_s | raking along the paper
| |
00:57 | troy_s | in a gradient
| |
00:57 | troy_s | bright to dark
| |
00:57 | troy_s | whatever the distance ratio is
| |
00:57 | troy_s | should hold up in data according to the inverse square law
| |
00:57 | Bertl | sounds reasonable
| |
00:58 | troy_s | so if say, the distance from left edge is 2x
| |
00:58 | troy_s | then 1/4 values can be estimated.
| |
00:59 | troy_s | (you can probably cheat the source to get you more easily calculated values too.)
| |
00:59 | troy_s | Just avoid reflectors like light hoods etc.
| |
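The raking-light test troy_s outlines reduces to a couple of helpers (names and tolerance are illustrative; it assumes a bare point source and a flat, uniform target):

```python
def expected_ratio(d_near, d_far):
    """Inverse-square law: a patch at d_far receives (d_near/d_far)^2 of
    the light falling on a patch at d_near, for an ideal point source."""
    return (d_near / d_far) ** 2

def looks_linear(value_near, value_far, d_near, d_far, tol=0.05):
    """If the sensor data is linear, the measured pixel-value ratio
    should match the inverse-square prediction to within noise."""
    measured = value_far / value_near
    return abs(measured - expected_ratio(d_near, d_far)) <= tol

# troy_s's example: doubling the distance should quarter the values
ok = looks_linear(1000.0, 250.0, 1.0, 2.0)
```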
01:00 | Bertl | let me see what lamps I can find
| |
01:01 | troy_s | it is also plausible that they encode some form of a curve in the data that maximizes the data capture range similar to a log curve.
| |
01:02 | troy_s | (that is effectively a form of compression)
| |
01:02 | Bertl | I actually doubt that
| |
01:02 | troy_s | Well it isn't unheard of. Most commercial cameras offer a log mode.
| |
01:03 | troy_s | For just that reason. A purely linear image 'wastes' many data values in terms of image collection)
| |
01:03 | Bertl | sure, but the specification of a sensor for medical imaging should at least give a hint
| |
01:03 | troy_s | Yeah
| |
01:04 | troy_s | I am no sensor guru on that front however.
| |
01:04 | troy_s | I am surprised they don't document all of this stuff if it is for some form of precision medical use.
| |
01:05 | troy_s | I glanced at the HDR line skipping technique etc, and I would have expected more documentation on how to interpret the blasted data.
| |
01:06 | FergusL | regarding HDR there is more than just line skipping
| |
01:06 | FergusL | on this sensor, that is
| |
01:07 | FergusL | (hu... I'm on the wrong computer right now, I don't have much to work with images)
| |
01:07 | FergusL | (I'm tempted to just try with nuke)
| |
01:08 | troy_s | Egads I would love to shoot with that sensor if it worked and had a decent bit of proper workflow output footage.
| |
01:09 | troy_s | FergusL: It does a window mode and a line skip mode.
| |
01:09 | troy_s | Where it appears to read two separate readings at intervals for window mode.
| |
01:09 | FergusL | you consider it's not working ?
| |
01:13 | troy_s | Not at all.
| |
01:13 | troy_s | I was just looking at the data sheet for the HDR modes.
| |
01:14 | troy_s | Anyways... must sleep. Pleasant to finally see an image!
| |
01:14 | FergusL | thanks for your help !
| |
01:14 | troy_s | G'nite folks. Poke me via PM for anything to avoid getting buried in the scrollback buffer.
| |
01:14 | troy_s | Bertl: G'nite.
| |
01:14 | troy_s | (email works too)
| |
01:16 | FergusL | going to sleep too, btw
| |
01:16 | Bertl | have a good sleep
| |
01:17 | FergusL | thanks, I'll be there hopefully "earlier" tomorrow
| |
01:17 | FergusL | maybe we can talk ("work" ?)
| |
01:18 | Bertl | yeah, I'll do some tests with the lamp
| |
01:19 | FergusL | cool, night !
| |
06:40 | Sasha_C | joined the channel | |
07:10 | se6astian | joined the channel | |
07:10 | se6astian | morning!
| |
07:10 | Sasha_C | Good morning Sebastian. I'm moments away from sending out an email to the team, based on a discussion Herbert and I had earlier today
| |
07:11 | Sasha_C | How are you feeling? Still stressed?
| |
07:20 | Sasha_C_ | joined the channel | |
07:20 | Sasha_C | left the channel | |
07:20 | Sasha_C_ | changed nick to: Sasha_C
| |
07:31 | se6astian | hi, a bit less stressed today :)
| |
07:32 | Sasha_C | good to hear :D
| |
07:57 | dmj_nova | Bertl: FergusL: If I remember correctly, what the sensor documentation *claims* it does is provide up to 3 regions with different slope.
| |
07:58 | dmj_nova | I think each region is supposed to be linear.
| |
07:58 | dmj_nova | This could bear checking, however.
| |
08:04 | se6astian | Sasha_C, replied
| |
08:04 | Sasha_C | Thanks, I can see it
| |
08:04 | se6astian | dmj_nova, you refer to the PLR HDR mode right?
| |
08:04 | Sasha_C | just about to reply back ;)
| |
08:05 | dmj_nova | se6astian: yes
| |
08:05 | dmj_nova | I'm not sure if this is being used
| |
08:05 | dmj_nova | but that could explain non-linearity
| |
08:05 | se6astian | the name carries the answer: piecewise LINEAR response mode ;)
| |
08:06 | se6astian | explain which nonlinearity, did I miss everything again during the night?
| |
08:07 | dmj_nova | yes
| |
08:08 | dmj_nova | this image: http://vserver.13thfloor.at/Stuff/AXIOM/RAW/colors_500ms_composed.png
| |
08:09 | dmj_nova | troy_s claimed it appeared to have a TRC applied.
| |
08:09 | dmj_nova | and they were discussing how to test the response curve of the camera
| |
08:10 | se6astian | "TRC"?
| |
08:11 | dmj_nova | I'm not familiar with the acronym but from context, I think it's a curve applied to the data that makes it not linear light response
| |
08:26 | se6astian | hmm, I dont really follow
| |
08:27 | se6astian | Sasha_C, did you summarize any of the recent developments for the article yet?
| |
08:27 | se6astian | maybe we can get the new article done today
| |
08:32 | Sasha_C | I haven't added last night's developments yet
| |
08:33 | Sasha_C | But I can try to have it done by the end of this night
| |
08:36 | se6astian | it was a google doc right?
| |
08:36 | se6astian | can you give me what you have so far and I put it into the article layout for now
| |
08:36 | Sasha_C | well, the google doc only has the text that I've copied and pasted from my IRC window
| |
08:37 | Sasha_C | Give me fifteen minutes and I'll have it ready for you
| |
08:40 | se6astian | thanks
| |
08:58 | Sasha_C | Sebastian, here's the link to the google doc: https://docs.google.com/document/d/1AtexpX8avHEXORvTlnYrDMPn69FQv-uwM62DHqtSXok/edit
| |
09:03 | se6astian | thanks
| |
09:25 | se6astian | creating floorplan animation of the FPGA now
| |
09:25 | Sasha_C | great. Out of curiosity, who is the anonymous pumpkin, coyote and duck in the google doc?
| |
09:27 | se6astian | random names google generates
| |
09:27 | Sasha_C | i know, but I'm trying to figure out who else in this IRC room is looking at the doc
| |
09:27 | mars_ | its me ^^
| |
09:28 | Sasha_C | Thanks Mars
| |
09:28 | Sasha_C | Sebastian, what else would you like me to add to the document?
| |
09:52 | se6astian | https://www.apertus.org/axiom-alpha-opens-eyes
| |
09:53 | se6astian | well now we need to add the first captured image
| |
09:53 | se6astian | and then the second one that fixes most of the problems in the first one ;)
| |
09:55 | Sasha_C | Just read the article, looking FANTASTIC!
| |
09:55 | Sasha_C | Nice video :D
| |
09:57 | se6astian | :)
| |
10:14 | Sasha_C | Hey, I'm currently afk. I'll be back in 30mins
| |
10:50 | se6astian | lunchtime
| |
11:09 | FergusL | hi here
| |
11:18 | se6astian | back
| |
11:18 | se6astian | hi FergusL
| |
11:19 | se6astian | slightly updated: https://www.apertus.org/axiom-alpha-opens-eyes
| |
11:19 | se6astian | continuing to expand the article now
| |
11:36 | se6astian | ok I have the basic layout finished
| |
11:36 | se6astian | I think I will add a picture of the prototype assembly when I am at home as well
| |
11:36 | Sasha_C | I'm back
| |
11:42 | se6astian | renamed the article
| |
11:42 | se6astian | https://www.apertus.org/axiom-alpha-first-images
| |
11:42 | se6astian | anything else we should add?
| |
11:42 | se6astian | next steps maybe
| |
11:42 | Bertl | morning everyone!
| |
11:42 | Sasha_C | Sebastian, I just want to edit one sentence in the first paragraph. It'll be a minor change.
| |
11:43 | Sasha_C | morning Bertl
| |
11:53 | se6astian | hi Bertl
| |
11:53 | se6astian | go ahead Sasha_C
| |
12:06 | FergusL | Hi Bertl
| |
12:10 | Bertl | hey, it looks like the sensor data is linear after all :)
| |
12:10 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/linear_2.svg
| |
12:10 | Sasha_C | Done, and looking great (if I say so myself): https://www.apertus.org/axiom-alpha-first-images
| |
12:14 | FergusL | does this look linear to you ?
| |
12:14 | Bertl | yes, nice article! good work everyone!
| |
12:14 | Bertl | FergusL: yes, the curves are what we expect assuming a linear sensor
| |
12:15 | Bertl | i.e. light will behave like the dark curves
| |
12:15 | FergusL | (linear is not always what we think it is)
| |
12:15 | Bertl | and the more colorful curves are what we measure
| |
12:15 | FergusL | yes
| |
12:15 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/linear_1.svg
| |
12:16 | Bertl | here is measured divided by expected
| |
12:16 | FergusL | I see
| |
12:16 | Bertl | i.e. a horizontal line means constant factor
| |
12:16 | FergusL | yes, I can get that
| |
12:16 | FergusL | but hm...
| |
12:17 | FergusL | in the other image, the dark lines are the "ideal" behaviour ?
| |
12:17 | FergusL | like, physically rigorous
| |
12:17 | Bertl | yes, that would be according to physics, the curve a linear system would show
| |
12:17 | Bertl | basically 1/square-distance corrected by Lambert's cosine law
| |
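The "expected" dark curves Bertl mentions can be generated like this (the lamp-above-the-paper geometry is an assumption about his setup, purely for illustration):

```python
import math

def expected_illuminance(lamp_height, offset):
    """Relative illuminance on a flat target from a point source:
    inverse-square falloff corrected by Lambert's cosine law.

    lamp_height: perpendicular distance from lamp to the paper plane
    offset: distance along the paper from the point under the lamp
    """
    d2 = lamp_height ** 2 + offset ** 2      # squared lamp-to-patch distance
    cos_theta = lamp_height / math.sqrt(d2)  # angle of incidence vs normal
    return cos_theta / d2                    # E ~ cos(theta) / d^2

# directly under the lamp vs one lamp-height away along the paper
e0 = expected_illuminance(1.0, 0.0)
e1 = expected_illuminance(1.0, 1.0)
```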
12:18 | FergusL | okay
| |
12:18 | FergusL | I'm still missing some points of the theory, even after troy_s explained it to me countless times
| |
12:19 | FergusL | because "visually", the dark lines certainly don't look linear
| |
12:19 | FergusL | linear as in 1 + 1 = 2
| |
12:22 | FergusL | Bertl: as troy_s mentioned too, what we're discussing right now is what many users will want to know and trust, so I could suggest to write an article about this at some point
| |
12:23 | FergusL | it would clearly target an educated audience but could also sum up the theory lying behind it at the same time
| |
12:23 | FergusL | CC se6astian ^
| |
12:25 | se6astian | we can add it to this article if we want
| |
12:25 | se6astian | is there anything else you think should be added
| |
12:25 | se6astian | I want to cover next steps briefly
| |
12:25 | se6astian | and add a picture of the prototype assembly which I have on my home pc
| |
12:27 | FergusL | I'm assuming you want to release this article asap; Bertl's/our investigations regarding linearity can still use some more time I think, basically it can wait
| |
12:29 | Bertl | yes, and don't forget, it is not a precise measurement or calibration, it is just a quick test to check the data
| |
12:29 | FergusL | yes, too
| |
12:29 | FergusL | from what I understood, we could get an IT8 chart thanks to troy_s
| |
12:29 | FergusL | maybe wait until then
| |
12:31 | Sasha_C | Good decision. And on that note, I have to go get some sleep so I can wake up on time for work tomorrow :(
| |
12:31 | FergusL | there was a very short discussion on cml recently about Axiom
| |
12:32 | Bertl | Sasha_C: make that! thanks again and have a good night!
| |
12:32 | FergusL | thanks Sasha_C, bonne nuit !
| |
12:32 | Sasha_C | left the channel | |
12:32 | se6astian | cml?
| |
12:33 | FergusL | cinematography mailing-list, I've already talked about it, I think
| |
12:36 | se6astian | ah great
| |
12:38 | FergusL | I think cml rules include that mails are not to be published outside of the mailing list but basically there were some doubts expressed about the modular design
| |
12:43 | FergusL | especially regarding power connections : if the battery is at the back and the camera at the front, that makes several connections in series, which isn't great
| |
12:43 | FergusL | that's a valid point I believe
| |
12:44 | se6astian | interesting to see a discussion on such a level there, good signs :)
| |
12:44 | FergusL | absolutely, I'm still monitoring cml for more mails about Apertus, I'll make sure to report to the community
| |
12:45 | se6astian | great
| |
12:45 | se6astian | about the power lines we already had a discussion in Brussels when we were in the train back to the airport
| |
12:46 | se6astian | there are several options IIRC like having the power go in a loop from battery to parts and from battery to head and from there back again
| |
12:46 | se6astian | Bertl will surely remember more of the technical details
| |
12:46 | FergusL | there could also be a direct single-cable connection, although that doesn't fit entirely into the modular paradigm
| |
12:46 | FergusL | I see
| |
12:48 | Bertl | yes, power management and distribution will be something we have to design and test carefully in the modular concept
| |
12:48 | Bertl | worst case scenario is a primitive power plug on every module, directly connected to a battery pack :)
| |
12:50 | Bertl | the main problem with power is the distribution, for example, it would be very efficient (power-consumption wise) to generate the required voltages in one place (e.g. 1.8V, 2.5V, 3.3V, 5V)
| |
12:50 | FergusL | http://www.afcinema.com/-2014-.html ever heard about the AFC ? it's basically BSC or ASC in France, they hold a show every year
| |
12:50 | FergusL | maybe some day we could have a booth there
| |
12:50 | Bertl | OTOH, when we transport 5V over 5 modules, we will get like 3V at the end, if all those modules consume a certain amount of power
| |
12:51 | FergusL | I see
| |
12:51 | Bertl | (unless we use superconductors :)
| |
12:52 | FergusL | it's the same effect that makes looong unshielded cables weak ?
| |
12:52 | FergusL | like, over 10 meters you lose X volts ?
| |
12:53 | Bertl | yup
| |
12:53 | Bertl | it is called resistance
| |
12:54 | Bertl | http://en.wikipedia.org/wiki/Electrical_resistance
| |
12:54 | FergusL | ah yes, because even a wire has a resistance value; it's insignificant in most cases but becomes important over longer distances
| |
12:55 | Bertl | not just distance, it also depends on the power
| |
12:55 | Bertl | or to be precise, on the current
| |
12:56 | Bertl | http://en.wikipedia.org/wiki/Ohm%27s_law
| |
12:56 | Bertl | so basically if you have 10V at 1A (on the receiving side)
| |
12:57 | Bertl | this equals a 'resistance' of 10 Ohm
| |
12:57 | Bertl | now if the wire adds 1 Ohm, you already need 11V at the provider
| |
13:03 | FergusL | I'm more used to using Ohm's law as U = RI, because most of the time R is known
| |
13:05 | Bertl | well, that's fine here, the R = Rwire + Rconsumer
| |
13:05 | Bertl | and they form a voltage divider
| |
13:06 | FergusL | yes, I'm aware of these concepts
| |
13:06 | Bertl | where the voltage at the consumer is Rconsumer/R
| |
13:06 | Bertl | times the provider voltage
| |
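Bertl's numbers can be checked in a few lines. `consumer_voltage` and `required_supply` are hypothetical helper names for the two ways of looking at the same divider:

```python
def consumer_voltage(supply_v, r_wire, r_consumer):
    """Voltage left at the consumer after the wire takes its share:
    wire and consumer form a simple series voltage divider."""
    return supply_v * r_consumer / (r_wire + r_consumer)

def required_supply(consumer_v, consumer_a, r_wire):
    """Supply voltage needed so the consumer still sees consumer_v
    while drawing consumer_a through a wire of resistance r_wire."""
    return consumer_v + consumer_a * r_wire

# Bertl's example: 10 V at 1 A looks like a 10 ohm load; 1 ohm of wire
# means the provider must push 11 V.
assert required_supply(10.0, 1.0, 1.0) == 11.0
# The divider view agrees: 11 V across 1 + 10 ohm leaves 10 V at the load.
assert abs(consumer_voltage(11.0, 1.0, 10.0) - 10.0) < 1e-9
```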
13:07 | FergusL | how are connectors treated in this matter ? like wires touching each other
| |
13:07 | FergusL | as an equivalent resistance ?
| |
13:09 | Bertl | they have a resistance, yes
| |
13:09 | FergusL | I get it
| |
13:09 | Bertl | they also cause strange effects because different metals are involved
| |
13:10 | FergusL | I do some electronics, Arduino and stuff, but I'm more into controllers, specifically controllers for music making
| |
13:10 | Bertl | mostly thermoelectric effects
| |
13:10 | FergusL | I see, didn't know that
| |
13:13 | Bertl | http://en.wikipedia.org/wiki/Seebeck_effect#Seebeck_effect
| |
13:16 | FergusL | do you have knowledge about audio in electronics ? amplification, signal path and power
| |
13:16 | Bertl | somewhat
| |
13:20 | FergusL | one of my projects is to build a small amplifier board (maybe including a USB DAC as well) that would be portable and go with small embedded Linux systems... but... that's off-topic !
| |
13:30 | FergusL | back to the topic, Bertl, your experiment would suggest that the images coming from the sensor are linear ? did you test pre or post debayer ?
| |
13:31 | FergusL | pre, I believe ?
| |
13:31 | Bertl | everything on the raw data, so no debayering or similar
| |
13:31 | Bertl | all I did was split up the data according to the channel
| |
13:32 | Bertl | (4 channels, 1 red, 1 blue, 2 green)
| |
13:33 | FergusL | I see
| |
13:34 | FergusL | do you have the code somewhere for that "SIMPLE" debayer algorithm ?
| |
13:35 | FergusL | I might just be wasting your time if I'm wrong but I'm suspecting it's not linear
| |
13:35 | Bertl | sure, but it wasn't used
| |
13:36 | FergusL | of course it wasn't, yes
| |
13:36 | FergusL | am just curious of what it does, especially regarding tone curve correction
| |
13:37 | Bertl | https://github.com/jdthomas/bayer2rgb
| |
13:38 | Bertl | https://github.com/jdthomas/bayer2rgb/blob/master/bayer.c line 569
| |
13:41 | FergusL | https://github.com/jdthomas/bayer2rgb/blob/master/bayer.c#L569 <- this works :)
| |
13:41 | FergusL | oh noes, bit shifting, why why why é_è
| |
13:51 | Bertl | well, AFAICT, it's a perfectly fine integer division with rounding
| |
13:51 | Bertl | (a + b + 1) >> 1
| |
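A minimal check that the shift Bertl quotes from bayer.c really is an ordinary integer average with rounding:

```python
def avg_round(a, b):
    """Average of two non-negative ints, rounded to nearest (ties up),
    using the same shift trick as bayer.c line 569: (a + b + 1) >> 1."""
    return (a + b + 1) >> 1

# Identical to rounded integer division by two for small sensor values:
for a in range(256):
    for b in range(256):
        assert avg_round(a, b) == (a + b + 1) // 2

assert avg_round(3, 4) == 4   # 3.5 rounds up
assert avg_round(2, 4) == 3
```

So the shift is not a lossy hack; it is the cheapest way to express rounded averaging in integer arithmetic.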
13:52 | FergusL | yes yes, it's just an old fear of mine
| |
13:52 | Bertl | as the resulting data is not further processed in any way, it is probably as good as it gets
| |
13:55 | FergusL | you're refering to the debayer there, right ?
| |
13:55 | FergusL | apparently, no, it's not processing in any way
| |
13:57 | Bertl | yep
| |
13:59 | se6astian | any thoughts on the apertus° association accepting bitcoins for donations in the future?
| |
14:01 | Bertl | why not
| |
14:02 | FergusL | I'm all for it
| |
14:04 | FergusL | I'm all for anything that would put us more on the path of things we believe in: transparency, independence and unbound honesty
| |
14:06 | se6astian | agreed
| |
14:08 | FergusL | ultimately that'd also include leaving Google, Facebook, Paypal, capitalism, "aggressive business models"
| |
14:08 | FergusL | (I'm not saying we have an aggressive business model)
| |
14:11 | FergusL | Bertl: did you check the assumed bayer pattern is the right one ?
| |
14:11 | FergusL | hm, nevermind, that would be obvious if you didn't
| |
14:12 | Bertl | yep, actually I tried all of them and took the one which looked the least ugly :)
| |
14:13 | FergusL | it wouldn't hurt to write our own debayer
| |
14:13 | FergusL | well, it would do exactly the same
| |
14:14 | Bertl | originally I wanted to use dcraw, because it is kind of famous in the community
| |
14:15 | FergusL | yes
| |
14:15 | Bertl | but I didn't manage to get it working with real raw data (i.e. bayer mosaic data)
| |
14:15 | FergusL | I'm interested in investigating in this side
| |
14:15 | Bertl | and the folks I asked told me, it would be best to write my own 'raw' importer for dcraw, to read the raw files :)
| |
14:16 | FergusL | it's even possible to compute the debayering as a node system in Blender or else
| |
14:16 | Bertl | so it seems that dcraw cannot read 'raw' after all, it can only read the various proprietary formats the companies call 'raw'
| |
14:16 | FergusL | (maybe that could be a good thing to make it easy for people to understand)
| |
14:17 | FergusL | what's the "format" you get ? is it a standard of some sort ?
| |
14:18 | Bertl | go crazy, you have the 'real raw' data, give it a try and see how it looks, if it's fine and can be pipelined in an efficient way, I can use it
| |
14:18 | Bertl | the 'format' is the one I described on the dev mailing list
| |
14:19 | Bertl | basically 12-bit values padded to 16-bit gray values in big endian, arranged in the RGGB bayer pattern
| |
14:19 | FergusL | it's also rather trivial to write it for ImageMagick with Python
| |
14:20 | FergusL | RGGB, got it
| |
14:20 | Bertl | no headers, no additional metadata yet
| |
14:21 | Bertl | although in the near future I will probably dump the registers at the end
| |
14:21 | Bertl | i.e. just dump out all the sensor registers one after the other for later processing
| |
14:21 | se6astian | gabe said he is working on a DNG converter
| |
14:21 | Bertl | yep, I hope it will become useful soon
| |
14:22 | FergusL | very useful to make it usable in decent applications for further checking and investigation
| |
14:22 | FergusL | linear formats would be cool at some point too, dpx or exr
| |
14:22 | Bertl | well, I can't believe that photoshop and friends cannot read bayer data as gray image
| |
14:22 | FergusL | Bertl: I'm not sure I understand, how different would this dump be from the raw data you get ?
| |
14:23 | Bertl | it would be what we have now <raw image data> plus <register dump> appended at the end
| |
14:23 | FergusL | Blender should, if you mark the input colorspace as... hm.. what is it called ? "none" or something
| |
14:23 | Bertl | so instead of 4096x3072x2 bytes, there would be additional 128*2 bytes register values
| |
14:23 | FergusL | ohhh, the registers, absolutely, that'd be metadata, right ?
| |
14:24 | Bertl | for post processing, just remove the last 256 bytes or ignore them
| |
14:24 | Bertl | and for analysis, just read the last 256 bytes and you know exactly what the sensor was configured for
| |
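A sketch of a reader for the format Bertl describes, assuming exactly the layout above: headerless 16-bit big-endian words (12 significant bits) in an RGGB mosaic, with 128 16-bit registers appended. `split_raw` is a hypothetical helper name, and the 4x4 frame is a tiny stand-in for real 4096x3072 captures:

```python
import struct

def split_raw(blob, width, height, n_regs=128):
    """Split one headerless sensor dump into Bayer planes + registers.

    The last n_regs 16-bit words are the appended register dump; for
    post processing you would simply drop them, for analysis they tell
    you exactly what the sensor was configured for."""
    px_bytes = width * height * 2
    regs = struct.unpack(f">{n_regs}H", blob[px_bytes:px_bytes + n_regs * 2])
    flat = struct.unpack(f">{width * height}H", blob[:px_bytes])
    rows = [flat[y * width:(y + 1) * width] for y in range(height)]
    planes = {                                  # RGGB: R G / G B per 2x2 cell
        "R":  [row[0::2] for row in rows[0::2]],
        "G1": [row[1::2] for row in rows[0::2]],
        "G2": [row[0::2] for row in rows[1::2]],
        "B":  [row[1::2] for row in rows[1::2]],
    }
    return planes, regs

# Tiny synthetic frame with pixel values 0..15 plus a fake register dump:
frame = struct.pack(">16H", *range(16)) + struct.pack(">128H", *range(128))
planes, regs = split_raw(frame, 4, 4)
assert planes["R"] == [(0, 2), (8, 10)]
assert planes["B"] == [(5, 7), (13, 15)]
assert len(regs) == 128 and regs[127] == 127
```

This is the same per-channel split Bertl used for the linearity plots, with no debayering involved.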
14:25 | FergusL | ok, I get it, thanks
| |
14:25 | Bertl | np
| |
14:32 | FergusL | opencolorIO and openimageIO (http://opencolorio.org/index.html https://sites.google.com/site/openimageio/home) are good tools to look into as well
| |
14:32 | FergusL | it's like ImageMagick made particularly for our field
| |
14:36 | se6astian | going home from my office now
| |
14:36 | se6astian | see you
| |
14:36 | se6astian | left the channel | |
14:39 | FergusL | Bertl: what was used for the export to .png ?
| |
14:39 | Bertl | okay, just added that register dump feature to my tool
| |
14:39 | FergusL | cool
| |
14:39 | FergusL | is that code somewhere ?
| |
14:39 | Bertl | I'm using ImageMagick for almost everything
| |
14:39 | FergusL | I see
| |
14:39 | Bertl | I haven't uploaded it yet, but I will upload it soon
| |
14:39 | FergusL | github ?
| |
14:41 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/cmv_snap.c
| |
14:41 | Bertl | here you go :)
| |
14:42 | FergusL | thanks
| |
14:44 | FergusL | my C is like my german, I'm still working on it
| |
14:52 | FergusL | what are the mmap() for ?
| |
14:52 | FergusL | is it the step writing to registers ?
| |
14:52 | FergusL | though it's also writing to registers later
| |
14:53 | Bertl | everything on the ARM Cortex platform is somewhere in memory
| |
14:53 | Bertl | the sensor registers, the data collected from the sensor, etc
| |
14:53 | Bertl | the mmap maps this physical memory to a virtual memory region
| |
14:54 | Bertl | (so that it can be directly accessed)
| |
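What cmv_snap.c does with mmap() can be shown in miniature. A temporary file stands in for the physical memory window here, since mapping real sensor registers needs /dev/mem-style access on the actual platform; the mechanics of mapping, then reading and writing "registers" in place, are the same:

```python
import mmap
import os
import struct
import tempfile

REG_BLOCK_SIZE = 256  # 128 16-bit registers, matching the dump format

# A zeroed file plays the role of the physical register block.
fd, path = tempfile.mkstemp()
os.write(fd, bytes(REG_BLOCK_SIZE))

# mmap gives us a byte-addressable window onto it, exactly how the
# sensor registers become directly accessible in virtual memory.
with mmap.mmap(fd, REG_BLOCK_SIZE) as regs:
    struct.pack_into(">H", regs, 0, 0x0FFF)  # write "register 0" in place
    (value,) = struct.unpack_from(">H", regs, 0)

os.close(fd)
os.remove(path)
assert value == 0x0FFF
```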
14:56 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/cmv_train.c
| |
14:56 | Bertl | this is the code to train the LVDS channels
| |
16:15 | FergusL | Bertl: train ? they need to train ?
| |
16:15 | FergusL | like, otherwise, we're back to NTSC ?
| |
16:35 | Bertl | well, yes, the process of adjusting the bitstreams (delay wise) is called training
| |
16:43 | FergusL | I see
| |
16:44 | FergusL | do I have the time to do some tests on Nuke hm..
| |
16:57 | dmj_nova | left the channel | |
17:18 | troy_s | Bertl: Tested? I recommend OIIO compiled with OCIO and iv tool.
| |
17:18 | FergusL | ding ding :)
| |
17:18 | troy_s | Bertl: OCIO allows a well documented transform on both the TRC (tone response curve) and the primaries.
| |
17:19 | Bertl | if you can elaborate on the acronyms, maybe I can comment :)
| |
17:19 | FergusL | these are the links I put there earlier
| |
17:19 | FergusL | open <foo> IO
| |
17:19 | troy_s | Bertl: Would be good to see if the vendor will supply native primaries. I could do up a LUT configuration.
| |
17:20 | troy_s | Bertl: OCIO = Sony (Jeremy Selan's) OpenColorIO
| |
17:20 | troy_s | Bertl: OIIO = Sony (Larry Gritz's) OpenImageIO
| |
17:21 | troy_s | Bertl: TRC = Tone Response Curve (aka Transfer Curve) (aka the wrong term Gamma)
| |
17:21 | troy_s | Back to work... ping and I will check from time to time.
| |
17:21 | Bertl | okay, well, we probably can do that at some point when we have test charts
| |
17:22 | Bertl | troy_s: what's your comment on the linearity? :)
| |
17:32 | tonsofpcs | left the channel | |
17:32 | tonsofpcs | joined the channel | |
17:35 | troy_s | Bertl: I trust your math. Did you try the test we discussed via lamp?
| |
17:36 | troy_s | Bertl: Can you yank a shot of a generic scene with hots and darks in it?
| |
17:37 | troy_s | Bertl: It is possible the tape was overexposed all to hell... but the image sure looked "normal", which suggests it wasn't linear. Could be subject matter of course.
| |
17:38 | troy_s | Bertl: I only suggested OCIO and OIIO as they are reference baseline motion picture libs. OCIO is in Nuke, Silhouette, etc. Both OCIO and OIIO are in many post houses. Further, it is great if everyone gets to know DPX and EXR as well.
| |
17:42 | dmj_nova | joined the channel | |
17:47 | se6astian | joined the channel | |
18:02 | Bertl | troy_s: everything is possible regarding exposure
| |
18:02 | Bertl | I just took the first image where I could clearly see the tapes and colors
| |
18:02 | Bertl | and yes, I did the discussed lamp test
| |
18:03 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/linear_1.svg
| |
18:03 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/RAW/linear_2.svg
| |
18:04 | Bertl | (details were posted on the devel mailing list)
| |
18:06 | Bertl | regarding a generic scene, I can probably take a photo or centerfold and capture that :)
| |
18:08 | FergusL | Right now you can't bring the prototype to a window and snap a picture there, right ?
| |
18:09 | Bertl | correct
| |
18:09 | FergusL | I was wondering, at this point, can apertus have interns ?
| |
18:09 | Bertl | we will be able to do that once the case is finished though
| |
18:10 | Bertl | I guess, if they work for free, we can have interns :)
| |
18:13 | FergusL | I'm looking for an internship, actually
| |
18:14 | FergusL | Did you take a look at ocio and oiio ?
| |
18:15 | Bertl | check with se6astian what might be possible and what not (wikipedia says, in France labor law requires a minimum payment)
| |
18:17 | FergusL | (facts say, in France it's very likely you might remain unpaid as an intern, applies to any field)
| |
18:19 | se6astian | well the company is in Belgium
| |
18:19 | se6astian | I have no idea about the law status there or what the requirements would be
| |
18:19 | se6astian | if you just need a signed paper I guess that should be possible
| |
18:19 | FergusL | I don't care about a payment, actually, what I want is bridging the gap between cinema and programming
| |
18:20 | Bertl | that's the spirit!
| |
18:20 | se6astian | but I have to warn you, we are very strict, mean and demanding bosses, this is kindergarten compared to being an intern at apertus: https://lh3.ggpht.com/-jfb2AXC6x6o/TmRIJOnCnsI/AAAAAAAACv4/Ds7nRMJLTdU/s400/galley%2Bslaves.jpg
| |
18:21 | FergusL | I just need a signed paper, indeed, I'll look into the details
| |
18:24 | se6astian | thanks, of course we need to wait for Oscar's OK as he owns the company
| |
18:24 | FergusL | Uh. I'm going to Red then.
| |
18:25 | Bertl | :)
| |
18:25 | se6astian | but if it does not mean additional work or expenses for him I guess it should be fine
| |
18:25 | se6astian | I bet Red is much cooler, you will surely get Oakley eyewear and regular rides in the company helicopter!
| |
18:26 | se6astian | possibly they will even let you create their entire next product on your own, I only heard that they managed to make the scarlet worse than the Red One :)
| |
18:27 | Bertl | you have to adapt to consumer demands
| |
18:27 | se6astian | added prototype picture and Oscar's cava picture as link to the article: https://www.apertus.org/axiom-alpha-first-images
| |
18:27 | se6astian | indeed, the customer is king ;)
| |
18:28 | Bertl | ah, cool!
| |
18:35 | dmj_nova | se6astian: isn't the scarlet supposed to be like the stripped down model?
| |
18:37 | se6astian | yes but of the epic, not the red one
| |
18:38 | dmj_nova | that said I've never used any of them
| |
18:38 | dmj_nova | of course the scarlet changed so many times before it was released
| |
18:48 | se6astian | what do you think of this prototype progress ad banner: https://www.apertus.org/axiom-alpha-hardware-complete ?
| |
19:10 | Bertl | besides the fact that it links to a 'Latest News' page which shows news from 3 weeks ago as 'latest' it looks really nice
| |
19:11 | se6astian | that ARE the latest public news :)
| |
19:12 | se6astian | about the prototype
| |
19:12 | Bertl | well, that _is_ sad :)
| |
19:13 | se6astian | a catastrophe!
| |
19:13 | Bertl | next news point will be 'crowd funding completed!' :)
| |
19:18 | se6astian | hopefully! and not "failed"
| |
19:22 | se6astian | would you suggest to mix Axiom and Alpha prototype news?
| |
19:22 | se6astian | currently it's strictly separated
| |
19:24 | Bertl | well, if it can be labeled accordingly then I guess it would make sense
| |
19:26 | se6astian | people tend to mix up the two already so we have to label them quite cleverly
| |
19:27 | se6astian | I don't know how many times I have seen people on forums complain that Axiom will only have a Nikon mount for example....
| |
19:27 | Bertl | how about a logo/icon?
| |
19:28 | Bertl | something which goes left of the title text or even over the associated image?
| |
19:28 | Bertl | like for example an 'alpha' or the greek letter maybe?
| |
19:31 | se6astian | like the sony alpha camera? https://upload.wikimedia.org/wikipedia/commons/0/00/Sony-Alpha-A700-Front.jpg
| |
19:31 | se6astian | might get us sued :)
| |
19:32 | se6astian | but a symbol is a good idea
| |
19:32 | Bertl | I don't think that a letter can be registered or trademarked (yet)
| |
19:33 | se6astian | the "geschmacksmuster" most likely is protected
| |
19:33 | dmj_nova | also one can have an 'alpha' that doesn't look like the Sony one
| |
19:33 | Bertl | otherwise microsoft would have one half of the alphabet and apple the other :)
| |
19:33 | se6astian | what if we used the prototype assembly as a kind of "icon" (like on the left of: https://www.apertus.org/sites/default/files/axiom-alpha-progress-update-ad_0.jpg)
| |
19:33 | se6astian | I like that it looks so "prototypical" with PCBs, cables, etc.
| |
19:34 | Bertl | yeah, well, it is probably too detailed to work as an icon
| |
19:34 | dmj_nova | Bertl: how much of the FPGA resources are used at present?
| |
19:34 | dmj_nova | for the very basic functionality it has now
| |
19:35 | se6astian | let's continue this discussion soon, I think we are on the right track
| |
19:35 | se6astian | gotta go afk for a bit
| |
19:36 | se6astian | ah not for another 15 minutes ;)
| |
19:36 | Bertl | dmj_nova: http://vserver.13thfloor.at/Stuff/AXIOM/utilization.rpt
| |
19:37 | Bertl | we will free up a little when I switch to the Zynq Serdes (which is on my todo list)
| |
19:39 | dmj_nova | Serdes?
| |
19:42 | Bertl | SERial/DESerializer
| |
19:43 | dmj_nova | Bertl: hmm...that's really not that much
| |
19:48 | Bertl | yes it is quite efficient if you leave out all that xilinx IP stuff :)
| |
19:50 | dmj_nova | haha
| |
19:51 | se6astian | good news: another electronics engineer just emailed us and wants to help
| |
19:53 | Bertl | hehe, you don't say :)
| |
19:53 | se6astian | note, you only emailed your reply to team@aper...
| |
19:54 | Bertl | yeah, the mailing list is obviously sending the wrong headers
| |
19:55 | Bertl | Reply-To: *email address removed*
| |
19:56 | se6astian | yes, well the website form sends all enquiries to the google group
| |
19:57 | Bertl | you might want to add the reply-to header to the envelope
| |
19:58 | se6astian | in the google group settings or for the websites contact form?
| |
19:58 | Bertl | probably in the contact form, unless google groups strips that information
| |
19:59 | se6astian | there are no settings like that in drupal
| |
19:59 | Bertl | check if the google-groups settings force a reply to?
| |
19:59 | se6astian | you can only select the target address
| |
20:00 | Bertl | I just googled for the issue and it seems there is an option
| |
20:01 | se6astian | http://picpaste.com/googlegroups-FyhIasLL.jpg
| |
20:02 | Bertl | I guess the last one would be the proper setting, but I have to test it
| |
20:02 | se6astian | it's a strange option to let the user decide; where would they set that when they receive an email?
| |
20:03 | Bertl | well, it is how the headers are formulated
| |
20:03 | Bertl | I can hit reply, or reply-group for example
| |
20:03 | Bertl | where reply would reply to the sender
| |
20:03 | Bertl | just give it a try and send a message via the form
| |
20:04 | Bertl | say that you are looking for a well-paid job :)
| |
20:05 | se6astian | "replying only to sender" is also not a good idea as people will then reply only to the other member who posted to the group but nobody else will get the reply if people are not careful
| |
20:05 | Bertl | as I said, let the user decide sounds correct to me
| |
20:05 | Bertl | i.e. it will not force the reply-to header
| |
20:06 | Bertl | you can always add the group in the cc reply
| |
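The header fix being discussed can be sketched for the contact-form side, assuming the form is free to set arbitrary headers on the mail it sends to the group. With Reply-To carrying the visitor's address, a plain "reply" goes to the enquirer instead of back to the team address. All addresses here are made up:

```python
from email.message import EmailMessage

# Hypothetical contact-form mail: Reply-To points at the enquirer, so a
# reply from a group member reaches them rather than looping to the group.
msg = EmailMessage()
msg["From"] = "webform@example.org"
msg["To"] = "team@example.org"
msg["Reply-To"] = "visitor@example.net"
msg["Subject"] = "Contact form enquiry"
msg.set_content("I am an electronics engineer and would like to help.")

assert msg["Reply-To"] == "visitor@example.net"
```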
20:06 | se6astian | https://support.google.com/groups/answer/2648318?hl=en
| |
20:06 | se6astian | Users decide where replies are sent - Replies to posted messages are decided on a per-member level.
| |
20:07 | se6astian | to me it sounds like any group member can set it individually
| |
20:07 | se6astian | let's give it a try
| |
20:09 | Bertl | looks good to me
| |
20:09 | Bertl | now just add the dev group to the cc, like the team one
| |
20:10 | se6astian | hmm, the test mail I sent doesn't work
| |
20:10 | Bertl | hmm?
| |
20:10 | se6astian | it still replies to the group only by default
| |
20:10 | Bertl | that is probably your mail client, but here it works fine
| |
20:10 | Bertl | (i.e. I presume your mail client has a default set to the group)
| |
20:11 | se6astian | gmail webclient
| |
20:11 | Bertl | I'm sure it can be configured somewhere, I don't use gmail
| |
20:12 | se6astian | checking...
| |
20:12 | Bertl | probably an option like 'list reply' or reply to mailing list or something like that
| |
20:12 | se6astian | but if it works for you now I will leave the setting and we will just see if the problem is gone for most people
| |
20:13 | Bertl | yep, works fine for me, thanks
| |
20:14 | se6astian | great
| |
20:14 | se6astian | ok now I really gotta go
| |
20:14 | se6astian | afk<-
| |
20:15 | Bertl | cya
| |
20:18 | troy_s | Bertl: Is there a public repo of the apertus mailing list?
| |
20:20 | Bertl | https://www.apertus.org/mailinglists
| |
20:20 | Bertl | I don't think so
| |
20:21 | Bertl | but we probably could/should make axiom-dev public in the near future
| |
20:22 | troy_s | Bertl: Is there a copy of your results from the low tech test?
| |
20:22 | [1]se6astian | joined the channel | |
20:23 | Bertl | give me your e-mail (pm) and I'll bounce it to you
| |
20:24 | [1]se6astian | I can also add you to the list if you plan to contribute/hang around in the future?
| |
20:25 | se6astian | left the channel | |
20:25 | [1]se6astian | changed nick to: se6astian
| |
20:39 | troy_s | se6astian: Sure.
| |
20:44 | se6astian | pm me your email address please
| |
20:53 | troy_s | I am not going to lie... that is an impressive test there Bertl. I don't quite understand the axes, but pretty impressive nonetheless.
| |
20:56 | Bertl | it is rather simple (the graphs) basically x is the inverse distance from the projected point light source
| |
20:57 | Bertl | i.e. the light is on the right side
| |
20:57 | Bertl | and y is the measured/calculated intensity
| |
20:58 | Bertl | in the curvy graph it is plotted as is, in the more horizontal graph one divides the other
| |
20:58 | Bertl | i.e. horizontal line would be a perfect match
| |
20:59 | Bertl | I think the peaks at the left end are due to black level noise
| |
20:59 | troy_s | So at 800 if the value is 50 (arbitrary fake value) then at 400 it should be 1/4 that?
| |
20:59 | Bertl | well, not exactly, we 'forgot' the cosine law yesterday
| |
21:00 | troy_s | Bertl: Sensors should fall apart near the edges of the sensitivity ranges. There is likely a good test there for sampling the color primaries as to which values to toss out.
| |
21:00 | Bertl | i.e. the light hits the paper at an angle, which affects the intensity seen from the camera proportional to the cosine of the angle between the light source and the normal
| |
21:00 | troy_s | Hrm?
| |
21:01 | Bertl | think about it like this, if you hit a unit square from the top, it will be well illuminated
| |
21:01 | troy_s | Gosh that wikipedia article uses the word luminance for perceptual lightness / brightness. Yuck.
| |
21:02 | troy_s | (Luminance strictly means the radiometrically correct value (Y), where luma is the TRC luminance (Y'))
| |
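troy_s's distinction can be made concrete: luma (Y') is a weighted sum of TRC-encoded components, which is not the same number as encoding the radiometric luminance (Y). The Rec.709 weights and the pure power function standing in for a real TRC below are my own illustrative choices, not anything specified in the chat:

```python
def weighted_sum(r, g, b):
    """Rec.709 weights, used here purely for illustration."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def encode(v, gamma=2.4):
    """Toy TRC: a pure power function stand-in for a real transfer curve."""
    return v ** (1.0 / gamma)

linear_rgb = (0.1, 0.5, 0.9)
luminance = weighted_sum(*linear_rgb)                      # Y, radiometric
luma = weighted_sum(*(encode(c) for c in linear_rgb))      # Y', on encoded values

# Weighting after the TRC is not the same as encoding the luminance:
assert abs(luma - encode(luminance)) > 0.01
```

This is why "luminance" for a perceptual, encoded quantity is the wrong word: the nonlinearity does not commute with the weighted sum.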
21:02 | dmj_nova | ah yes, shading models
| |
21:02 | troy_s | Bertl: Gotcha.
| |
21:02 | Bertl | so we assume an ideal diffuse paper here, which I think is a good approximation
| |
21:03 | troy_s | Bertl: I usually am holding a diffuse meter when using the inverse square... not looking via a spot meter.
| |
21:03 | Bertl | and contrary to our test setup, you are probably pointing it at the light source as well :)
| |
21:03 | Bertl | but for the given setup, it has to be accounted for :)
| |
21:05 | troy_s | Bertl: I am still worried that the test shot of the DVD cover looks TRCd.
| |
21:05 | Bertl | so yes, I probably over exposed the color band image
| |
21:05 | troy_s | Bertl: Was that image augmented in some way such as exposure or?
| |
21:05 | Bertl | no, but the DVD cover was without the IR/UV cutoff
| |
21:06 | Bertl | so you are basically seeing a lot of IR as well
| |
21:06 | troy_s | Because there is no way in hell that a linear typical image will resolve correctly on an LDR display. (Granted, it is an LDR paper source and jacking the exposure would likely bring it into range)
|