00:15 | comradekingu | left the channel | |
01:18 | pizthewiz | left the channel | |
01:20 | Bertl | off to bed now ... have a good one everyone!
| |
01:20 | Bertl | changed nick to: Bertl_zZ
| |
02:56 | pizthewiz | joined the channel | |
03:04 | pizthewiz | left the channel | |
03:15 | pizthewiz | joined the channel | |
03:15 | pizthewiz | left the channel | |
04:32 | intracube | changed nick to: intracube_
| |
08:16 | g3gg0 | joined the channel | |
09:07 | Andrej_ | left the channel | |
09:22 | se6astian|away | changed nick to: se6astian
| |
09:23 | se6astian | good morning
| |
09:28 | lab-bot | sebastian created T112: Standard icons for apertus.org project pages to github and wiki. http://lab.apertus.org/T112
| |
09:58 | lab-bot | sebastian created T113: CMV12000 PCB module progress. http://lab.apertus.org/T113
| |
09:58 | lab-bot | sebastian created T114: Remote Control Simulator draft. http://lab.apertus.org/T114
| |
10:02 | se6astian | updated wishlist table
| |
10:02 | se6astian | https://www.apertus.org/en/dictator
| |
10:03 | se6astian | gotta go
| |
10:03 | se6astian | changed nick to: se6astian|away
| |
13:14 | aombk2 | changed nick to: aombk
| |
13:50 | PhilCee | joined the channel | |
13:51 | Bertl_zZ | changed nick to: Bertl
| |
13:51 | Bertl | morning folks!
| |
13:58 | lab-bot | davidak created T115: Breadcrumb Links don't work. http://lab.apertus.org/T115
| |
14:30 | PhilCee | left the channel | |
16:04 | Juicyfruit | changed nick to: Juicman42
| |
16:04 | Juicman42 | changed nick to: Juiceman42
| |
16:08 | Juiceman42 | changed nick to: Juicyfruit
| |
16:13 | slikdigit | joined the channel | |
16:47 | bassam__ | joined the channel | |
16:47 | slikdigit | left the channel | |
16:47 | bassam__ | changed nick to: slikdigit
| |
16:47 | slikdigit | left the channel | |
16:47 | slikdigit | joined the channel | |
16:52 | slikdigit | left the channel | |
18:30 | intracube_ | changed nick to: intracube
| |
18:34 | se6astian|away | changed nick to: se6astian
| |
18:35 | wescotte | joined the channel | |
18:37 | se6astian | good evening
| |
18:38 | intracube | evening se6astian
| |
18:44 | Bertl | evening!
| |
19:22 | philippej | joined the channel | |
20:00 | Bertl | hey philippej!
| |
20:03 | philippej | Hello Bertl :-)
| |
20:04 | lab-bot | Bertl created T118: Check-in CMOSIS Datasheets to the Beta Repository. http://lab.apertus.org/T118
| |
20:16 | FergusL | left the channel | |
20:17 | FergusL | joined the channel | |
20:26 | lab-bot | sebastian closed T118: Check-in CMOSIS Datasheets to the Beta Repository as "Resolved". http://lab.apertus.org/T118
| |
20:52 | lab-bot | philippej closed T86: can we change the default applications in phabricator left hand menu? as "Wontfix". http://lab.apertus.org/T86
| |
20:52 | philippej | can anyone tell me what "spite" means in the realm of bug / task status?
| |
20:54 | troy_s | philippej: I suspect it is the sarcastic nature of Phabricator
| |
20:55 | troy_s | philippej: "So, I think features like these actually make the software better, which is why you can close tasks "Out of Spite" in Maniphest and the submit button in Differential is labeled "Clowncopterize". Plus, they're fun to build."
| |
20:55 | philippej | looks for the translation of out of spite
| |
20:56 | philippej | I see
| |
20:56 | troy_s | philippej: I believe that particular one is malicious closure
| |
20:56 | troy_s | As in if you kept bothering me about something and I tried to make my case but you kept at it
| |
20:56 | troy_s | Out of Spite would be a good reason
| |
20:56 | philippej | :-)
| |
20:56 | troy_s | It is close to a wontfix I would think
| |
20:56 | troy_s | But more edged.
| |
20:56 | philippej | we could create some additional status if needed btw
| |
20:57 | troy_s | I think it is hilarious
| |
20:57 | troy_s | It is a Linus Torvalds closure tag really. :)
| |
20:57 | troy_s | Without the f-bombs
| |
20:57 | philippej | I guess it's stuff like that that got me instantly hooked to phabricator
| |
20:57 | troy_s | (Although I could see a closure due to a nasty hard-to-find bug being spiteful too.)
| |
20:57 | troy_s | Yep
| |
20:57 | philippej | this and the fact that it's particularly well written and thought out
| |
20:58 | troy_s | philippej: "Thanks to Phabricator I can close the guy everyone hates' issues out of spite."
| |
20:58 | philippej | on install, it gives you advice on everything, like "the size of your mysql cache is a bit too small", etc ...
| |
20:58 | philippej | haha
| |
20:59 | troy_s | "deliberately hurt, annoy, or offend (someone)."
| |
20:59 | troy_s | That is the definition of spite from Google's top search.
| |
20:59 | troy_s | Hopefully that translates better for you.
| |
21:00 | philippej | perfectly, thanks
| |
21:00 | troy_s | Breadcrumb links not working philippej ?
| |
21:01 | philippej | breadcrumbs are tricky in drupal ...
| |
21:01 | Bertl | troy_s: I think they should work, but I haven't seen any labels for that in the source of the pages
| |
21:01 | troy_s | The hash tag one works here.
| |
21:01 | philippej | you have at least 10 add-on modules to manage
| |
21:01 | troy_s | The one without doesn't.
| |
21:02 | philippej | if a page is part of the menu the breadcrumb will show
| |
21:02 | troy_s | I just saw the Phab bug
| |
21:02 | philippej | else it won't
| |
21:02 | troy_s | Bertl: Back to the dungeon you!
| |
21:02 | philippej | what would be a decent behavior, that's the question
| |
21:02 | troy_s | J'adore Phabricator.
| |
21:03 | troy_s | This is slick http://files.apertus.org/controller-simulator/
| |
21:04 | philippej | ça rime en plus -> it rhymes :-)
| |
21:04 | troy_s | Sony's is extremely well designed graphically speaking.
| |
21:05 | FergusL | left the channel | |
21:06 | FergusL | joined the channel | |
21:08 | lab-bot | left the channel | |
21:08 | lab-bot | joined the channel | |
21:09 | philippej | lab bot will now be a little bit more verbose, it's an experiment, let me know if the guy talks too much
| |
21:10 | Bertl | lab-bot: what's up pal?
| |
21:11 | philippej | unfortunately it's still as dumb as before
| |
21:11 | philippej | but it will notify on some more task status changes
| |
21:32 | FergusL | left the channel | |
21:33 | FergusL | joined the channel | |
21:33 | lab-bot | philippej closed T30: apertus.org: email notification for comment publish as "Resolved". http://lab.apertus.org/T30
| |
21:44 | troy_s | On MOX http://major-kong.blogspot.co.uk/2014/10/a-mox-on-both-your-house-formats.html
| |
21:44 | troy_s | (That is by a Foundry developer who worked on Nuke of course)
| |
21:47 | lab-bot | sebastian reopened T30: apertus.org: email notification for comment publish as "Open". http://lab.apertus.org/T30
| |
22:07 | lab-bot | philippej claimed T112: Standard icons for apertus.org project pages to github and wiki. http://lab.apertus.org/T112
| |
22:19 | philippej | About MOX, I see the need for some kind of pdf-x for image sequences (fail- and idiot-proof, reduced feature set from a more general purpose format)
| |
22:19 | philippej | very interesting to see where it ends up
| |
22:20 | troy_s | philippej: I worry that MOX has too much of a myopic view on Adobe products. ICCs etc.
| |
22:20 | troy_s | Really awful idea(TM)
| |
22:21 | troy_s | philippej: Your suggestion is a good point, but at that very point you make an ideological judgment on who / what is your target. To optimize for that will ostracize the other designed targets.
| |
22:21 | troy_s | philippej: It speaks volumes that he got the idea from the After Effects forum.
| |
22:22 | philippej | I'm not taking the pdf/x example because it was created by adobe, but more for its "less is more" philosophy that works extremely well.
| |
22:22 | troy_s | philippej: Whereas Bink's commentary is from the highest end of imaging, and if the high end doesn't adopt it, it's as good as dead.
| |
22:22 | troy_s | Less is never more.
| |
22:22 | troy_s | Less is always less.
| |
22:22 | troy_s | Fsck Mies Van Der Rohe
| |
22:22 | philippej | bring an interactive pdf to the print shop :-)
| |
22:23 | troy_s | (Uniquely Industrial-Modernist. :) )
| |
22:23 | troy_s | PDFs go to the print shop all the time, but not because they are reduced.
| |
22:23 | philippej | then let's say, reducing features sometimes eases communication between parties
| |
22:23 | troy_s | Rather they tend to target the lowest common denominator.
| |
22:23 | troy_s | Sure. And I totally agree.
| |
22:23 | troy_s | The problem is your implied "parties" there.
| |
22:24 | troy_s | Everyone (much like Apertus) is going to have an opinion. The challenging question is who do you listen to?
| |
22:24 | troy_s | (Depends on goals of course, but a minor design decision can have rather massive implications. For example, if Brendan goes with ICC, there is a less than zero percentage chance that any post house anywhere will touch the format. It is a still-born.)
| |
22:25 | troy_s | And if you implement OCIO, you gain them and alienate the Adobe users he is appeasing.
| |
22:25 | troy_s | So... no easy solution there. And certainly _zero_ chance of a solution that works for both without violating your design tenet of reductionism.
| |
22:25 | troy_s | philippej: Sense?
| |
22:25 | philippej | yes
| |
22:26 | philippej | seems that pdf/x worked because it was a consortium of companies
| |
22:26 | troy_s | I don't pretend to have an answer there. Personally, I loathe After Effects.
| |
22:26 | troy_s | PDF 1.4 is everything I've ever had to take to an offset with.
| |
22:26 | troy_s | Not X.
| |
22:26 | philippej | that or because adobe won almost all the printed media tools
| |
22:26 | troy_s | And I'm pretty certain that a vast bulk of print shops probably request 1.4 support.
| |
22:26 | troy_s | Why did they win?
| |
22:27 | troy_s | :)
| |
22:27 | philippej | pdf/x is a subset of 1.4 as far as I understand it
| |
22:27 | troy_s | The answer ties directly into this discussion. They won because they focused on graphic design.
| |
22:27 | troy_s | Hence ICCs etc.
| |
22:27 | philippej | I think partly because at some point, you could reliably bring a pdf file from indesign and be sure it would print correctly
| |
22:27 | troy_s | The issue is that when they branched out into motion picture stuff, they chose ecosystem compatibility as a design goal and thereby crippled all of photographic imaging henceforth.
| |
22:28 | troy_s | (Photoshop and After Effects are _deeply_ entrenched in graphic design legacy decisions.)
| |
22:28 | troy_s | Sucks entirely for photography work at this juncture, yet not too many can see why.
| |
22:29 | troy_s | I mean it doesn't take a rocket surgeon to see their blend modes and go "Uh... WTF". Linear blend, for example, is a sort of hacky workaround for the fact that their entire subsystem is designed around display referred imaging with non-linear at the core.
| |
22:29 | troy_s | (All legacy reasons.)
| |
22:36 | __anton__ | joined the channel | |
22:37 | __anton__ | troy_s: hi there, reading with great interest :) is there a photo-image handling software that you like (replacing Photoshop?)
| |
22:38 | lab-bot | sebastian created T119: clear boost cache of a particular page when a page has been altered in drupal. http://lab.apertus.org/T119
| |
22:38 | troy_s | __anton__: I'm probably worth ignoring. But the question is moored in context, and to appreciate why I made the statement, you sort of have to unwind all of the implicit crap that folks speak of without stating.
| |
22:38 | philippej | tbh I have no big hope for the adoption of a file format by the industry. Seems like it goes all the time in the other direction (more formats, more obsolescence, more "jails")
| |
22:39 | troy_s | philippej: The "industry" are the folks in most need of interchange. A file format wrapper isn't the most ideal ever, but that takes a little bit of understanding to see why. You simply are never dealing with codecs nor kitchen sink apps.
| |
22:39 | troy_s | __anton__: You are familiar with imaging basics I take it?
| |
22:39 | __anton__ | troy_s: I can probably be ignored here as well, I'm a java programmer at a bank :) Yet I'm interested to hear if there's specifically a photo-image handling piece of software that works with scene-referred intensities
| |
22:40 | philippej | troy_s, do you include camera makers in the industry ?
| |
22:40 | jucar | left the channel | |
22:40 | __anton__ | troy_s: I'm pretty naive, yet I flatter myself thinking I understand what you're talking about
| |
22:41 | troy_s | __anton__: Do you have an image editor handy?
| |
22:41 | __anton__ | GIMP :)
| |
22:41 | troy_s | Perfect.
| |
22:41 | troy_s | Open it.
| |
22:41 | troy_s | Create a canvas. Fill it with a value of neutral grey of say 55 on a 255 8 bit scale.
| |
22:41 | lab-bot | philippej created T120: Provide better multicolumn and layout tools in drupal. http://lab.apertus.org/T120
| |
22:42 | troy_s | __anton__: Let me know when you have done this.
| |
22:43 | __anton__ | troy_s: done
| |
22:43 | troy_s | Make a fuzzy brush.
| |
22:43 | troy_s | A large one.
| |
22:43 | troy_s | Set R and B to full for a magenta
| |
22:43 | troy_s | Do three or four strokes.
| |
22:44 | troy_s | In a row like roman numerals.
| |
22:44 | troy_s | Done?
| |
22:45 | troy_s | (Must be a fuzzy brush of course.)
| |
22:45 | __anton__ | yup
| |
22:45 | troy_s | Now take a full green stroke and cross through all of them. Finally, take a full yellow stroke and also cross through them.
| |
22:45 | troy_s | So you have a picket fence basically
| |
22:45 | troy_s | Purple slats, full green and full yellow intensity cross beams.
| |
22:46 | lab-bot | sebastian created T121: find/build a method to publish individual news on different social media/newsletter/websites. http://lab.apertus.org/T121
| |
22:46 | __anton__ | got a fence :)
| |
22:46 | troy_s | Notice anything odd?
| |
22:46 | troy_s | In particular, around the edges of the blurry strokes?
| |
22:47 | troy_s | (To really make the final case, a full cyan (G + B) will reveal the issue across the purple.)
| |
22:47 | __anton__ | i see that my horizontal sausages got squeezed when they cross the vertical bars
| |
22:47 | troy_s | That isn't squeezing.
| |
22:48 | troy_s | If you look closely, you will see a distinct fringing around the purple.
| |
22:48 | troy_s | Black smokey fringing.
| |
22:48 | troy_s | Yes?
| |
22:48 | __anton__ | purple eats into yellow and green?
| |
22:49 | troy_s | You should see it with the saturated magenta crossing the grey, the green crossing the magenta, and the cyan.
| |
22:49 | troy_s | It doesn't eat.
| |
22:49 | troy_s | That's the result of a non-linear blend.
| |
22:49 | troy_s | Basically, when we ship colors to a display (LDR) we have to bend the intensities so they look like what we expect in our HDR reality.
| |
22:50 | troy_s | That bending results in nasty ass bad math because the math being done is _not_ based on any sort of real-world physics, which we define as our baseline of "correct".
| |
22:50 | troy_s | So that fringing is because you are painting in a display referred model using non-linear color models.
| |
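The fringe __anton__ is seeing can be reproduced numerically. A minimal sketch (mine, not from the chat), assuming a plain sRGB canvas and a single 50% alpha sample from the fuzzy brush edge:

```python
# Blend saturated magenta over the 55/255 grey at a partial brush alpha,
# once on the encoded (display referred) values and once on linearized values.
import numpy as np

def srgb_to_linear(c):
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

grey    = np.array([55, 55, 55]) / 255.0   # neutral grey canvas
magenta = np.array([1.0, 0.0, 1.0])        # full R + B brush
alpha   = 0.5                              # halfway into the fuzzy edge

encoded_blend = alpha * magenta + (1 - alpha) * grey          # blend done directly on encoded values
linear_blend  = linear_to_srgb(alpha * srgb_to_linear(magenta)
                               + (1 - alpha) * srgb_to_linear(grey))

print(encoded_blend)  # G ~0.11: the blend dips too dark -> the smokey fringe
print(linear_blend)   # G ~0.15: blending in linear light avoids the dip
```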
22:50 | __anton__ | yes smth about gamma I guess
| |
22:50 | __anton__ | so is there a tool for handling photos that uses linear?
| |
22:50 | troy_s | Indeed. Gamma is a uniform curve. A better term is transfer curve or tone response curve because not all non-linear curves are uniform. Some are a couple of formulas, some are more.
| |
22:51 | troy_s | Well the next bit of logic is
| |
22:51 | troy_s | "Ok I get it... we need linear (or to be more precise, radiometrically linear ratios) of data values to imitate the real world."
| |
22:51 | troy_s | Agree with that sort of basic explanation?
| |
22:51 | __anton__ | yes
| |
22:51 | troy_s | Now take your DSLR
| |
22:51 | troy_s | What kind do you have?
| |
22:51 | __anton__ | Nikon D40, Panasonic GH3
| |
22:51 | troy_s | Great.
| |
22:52 | troy_s | When you set it to JPEG, it mangles up the data it captures and spits out a display referred image from zero to one (or more specifically, 0-255 for an 8 bit image)
| |
22:52 | troy_s | It is _curved_
| |
22:52 | troy_s | and the ranges are fixed from minimum to maximum.
| |
22:52 | troy_s | But as you know, your D40, even though it isn't the uber super camera you could buy if you wanted to
| |
22:52 | troy_s | has... pretty damn good latitude. Agree?
| |
22:53 | troy_s | It can probably capture a decent six or seven stops above a middle grey card shot.
| |
22:53 | troy_s | Agree?
| |
22:53 | __anton__ | Yes, sounds about right
| |
22:53 | troy_s | In data now... If we expose such that a middle grey 18% reflectance card comes up at around, say, the middle of a JPEG's value set.
| |
22:54 | troy_s | It would be 120-124 ish
| |
22:54 | troy_s | If we _linearize_ it by flipping the sRGB curve around
| |
22:54 | troy_s | That value ends up at 0.2
| |
22:54 | troy_s | Or so.
| |
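For reference, a quick check of that figure (my sketch, assuming the standard sRGB decode formula):

```python
# Push a JPEG middle grey of ~122/255 backwards through the sRGB curve.
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

jpeg_grey = 122 / 255             # ~0.48 encoded, i.e. "120-124 ish"
print(srgb_to_linear(jpeg_grey))  # ~0.19 -> roughly the 0.2 quoted above
```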
22:54 | troy_s | With me so far?
| |
22:54 | troy_s | Or did I lose you in the rabbit hole?
| |
22:54 | __anton__ | yeah, I started looking into this recently when you spoke of OpenEXR, I think I get it
| |
22:54 | troy_s | Good.
| |
22:54 | troy_s | Case problem 1:
| |
22:54 | troy_s | Our JPEG can hold how many stops of data?
| |
22:55 | troy_s | 0.2 + 0.2 = 0.4
| |
22:55 | troy_s | 1 stop
| |
22:55 | lab-bot | philippej updated subscribers of T121: find/build a method to publish individual news on different social media/newsletter/websites. http://lab.apertus.org/T121
| |
22:55 | troy_s | 0.4+0.4 = 0.8 two stops
| |
22:55 | troy_s | Only two and a bit stops!
| |
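The counting above as a tiny script (assumes middle grey is pinned at 0.2 and the display referred ceiling is 1.0):

```python
# How many doublings (stops) fit above middle grey before hitting 1.0?
import math

middle_grey = 0.2
value, stops = middle_grey, 0
while value * 2 <= 1.0:
    value *= 2
    stops += 1

print(stops)                         # 2 full stops: 0.2 -> 0.4 -> 0.8
print(math.log2(1.0 / middle_grey))  # ~2.32, the "two and a bit" stops
```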
22:55 | troy_s | The camera has to take that massive dynamic range and bend it _really_ quite badly to cram it into a display referred jpeg.
| |
22:55 | troy_s | And we can't really get that data back because we only have the jpeg linearization to invert.
| |
22:55 | troy_s | If we take a raw data file
| |
22:56 | philippej | troy_s it would be really great to have articles on what you often explain here
| |
22:56 | troy_s | the values have a maximum and a minimum still, but we could, if we wanted, take the values and (because sensors are damn close to linear) dump them into an imaging editor.
| |
22:56 | troy_s | If we load them in GIMP, we are still stuck in display referred land - 0 to 1.
| |
22:57 | philippej | (I mean if someone could summarize this :-) )
| |
22:57 | troy_s | But as we saw in the GIMP demo, we _must_ have linearized values in order to do an over correctly.
| |
22:57 | troy_s | So if your Nikon D40 has say, six stops above middle grey
| |
22:57 | troy_s | and we map middle grey to 0.2
| |
22:57 | lab-bot | left the channel | |
22:57 | lab-bot | joined the channel | |
22:57 | troy_s | .2, .4, .8, 1.6, 3.2, 6.4, 12.8
| |
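The same doubling with the ceiling removed reproduces that progression, which is why scene referred data needs a float container rather than a fixed 0-1 range (illustration only):

```python
# Six stops above a middle grey pegged at 0.2, no display referred clamp.
middle_grey = 0.2
print([round(middle_grey * 2 ** s, 1) for s in range(7)])
# [0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8] -- most of it sits above 1.0
```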
22:58 | philippej | good night everyone !
| |
22:58 | troy_s | Enter scene referred imaging.
| |
22:58 | troy_s | __anton__: Does that make the issue any clearer?
| |
22:58 | troy_s | In such an application you'd need to specifically control several things:
| |
22:58 | __anton__ | Thanks, Troy, that's a very accessible explanation
| |
22:58 | troy_s | 1) What is the curve of the input image? There are an infinite number. The app needs to let you set the inversion correctly.
| |
22:59 | philippej | left the channel | |
23:00 | troy_s | 2) Such an app would need to have a mapping for white and black points. Arbitrarily high values and lower ones to the display referred domain. That is, we have to creatively control how we transform from the scene referred domain to the display referred.
| |
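A deliberately crude sketch of that second point (mine; the scene_white parameter is hypothetical, and a real scene-to-display transform would use a proper tone mapping curve rather than a straight divide and clip):

```python
# Map arbitrarily high scene referred values into the 0-1 display range by
# picking a scene white point, then encode for an sRGB display.
def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def scene_to_display(value, scene_white=12.8):
    mapped = min(value / scene_white, 1.0)   # crude white point mapping / clip
    return linear_to_srgb(mapped)

for v in (0.2, 1.6, 12.8, 40.0):
    print(v, "->", round(scene_to_display(v), 3))
# everything above scene_white clips to 1.0; everything below gets compressed
```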
23:00 | troy_s | But all of that crap I just sort of rolled through extremely swiftly, takes a bit of reasoning as to _why_ you need to control those things.
| |
23:00 | __anton__ | that's still great to read it
| |
23:00 | troy_s | And that isn't easily accessible to everyone, worse if you are only used to Photoshop or some other graphic-design centric application.
| |
23:01 | __anton__ | and it matches well with what you spoke about earlier
| |
23:01 | troy_s | Have you ever used an NLE app and done a dissolve?
| |
23:01 | troy_s | Probably right?
| |
23:01 | troy_s | Now ask yourself how those values were blended.
| |
23:01 | troy_s | Chances are they used the curved, non-linear values in the image carrier format, and guess what...
| |
23:01 | troy_s | that dissolve is as broken as your magenta over grey
| |
23:01 | __anton__ | :)
| |
23:02 | troy_s | Hence the idea of a video dissolve versus say, a film dissolve (or more accurately, a nonlinear blend versus a more correct radiometrically linear blend.)
| |
23:02 | troy_s | Interesting stuff if you are into imaging for certain. Not so much if you aren't.
| |
23:02 | troy_s | :)
| |
23:02 | __anton__ | I understand there are decent tools out there that work with video using scene referred imaging
| |
23:02 | __anton__ | What about photo?
| |
23:02 | troy_s | Yep.
| |
23:03 | troy_s | Very very very very very few to the best of my knowledge.
| |
23:03 | troy_s | Because A) most folks are used to things like Photoshop, and when you say something like 'scene referred' their head spins.
| |
23:03 | troy_s | B) the above "simple" explanation takes quite a bit of grounding foundation.
| |
23:03 | troy_s | C) Many folks are used to Aperture or Lightroom
| |
23:03 | troy_s | Which are specifically, processing tools.
| |
23:04 | troy_s | When processing, the model is more often than not smart to use a display referred form.
| |
23:04 | troy_s | Which opens up a huge can of worms
| |
23:04 | troy_s | As in ... can we mix and match our models of scene versus display?
| |
23:04 | troy_s | What happens if we do?
| |
23:04 | troy_s | Or do we divide the manipulations further into two classes:
| |
23:04 | troy_s | A) Image manipulation / blending / over / etc.
| |
23:04 | troy_s | B) Color Manipulation / grading.
| |
23:05 | troy_s | and suddenly it begins to make sense as to why you never grade mid pipeline, but rather only at the very very end.
| |
23:05 | Bertl | even blending or resizing doesn't give correct results if the space is nonlinear (unless the blending math corrects for it)
| |
23:06 | troy_s | Bingo.
| |
23:06 | troy_s | Nor blurs.
| |
23:06 | troy_s | Anything with interpolation.
| |
23:06 | troy_s | Because the interpolation requires a _traversal_ across a tri-color model.
| |
23:06 | troy_s | And anything that requires traversal is utterly going to be broken in a non-linear format.
| |
23:06 | troy_s | __anton__: Kind of interesting stuff eh?
| |
23:06 | troy_s | __anton__: Here's a cool question...
| |
23:07 | troy_s | Think of a dissolve briefly.
| |
23:07 | troy_s | Now visualize making a dissolve that runs a diagonal line from top to bottom ... a dissolve down to another strip.
| |
23:07 | troy_s | Being a LINEAR line, we can immediately say "Wait a minute... that means it is perceptually linear, because at 50% of the line we expect our display referred perceptual value to be at about half intensity"
| |
23:08 | troy_s | So how the hell do you do a dissolve using that sort of GUI element?
| |
23:08 | troy_s | If you _linearize_ the data, 50% along the line will have a dissolve that isn't perceptual at all, but rather quite a bit brighter at 50%.
| |
23:08 | lab-bot | Rebelj12a created T122: Reorganization. http://lab.apertus.org/T122
| |
23:08 | troy_s | Yet we _must_ have linearized data to do the dissolve 'correctly'.
| |
23:08 | troy_s | :)
| |
23:09 | troy_s | (Hint: You have to effectively bend the time curve)
| |
23:09 | troy_s | If you have a linear diagonal time curve, you have to bend that in accordance with your non-linear output.
| |
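One way to read the "bend the time curve" hint, sketched under the assumption of an sRGB display; this is an illustration, not how any particular NLE implements it:

```python
# Blend the frames on linearized values, but push the timeline fraction t
# through the inverse sRGB curve first, so a black-to-white dissolve still
# *looks* about half done at the halfway mark of the timeline.
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def dissolve(a_lin, b_lin, t):
    """a_lin, b_lin: linearized pixel values; t: 0..1 timeline position."""
    mix = srgb_to_linear(t)              # the bent time curve
    return (1 - mix) * a_lin + mix * b_lin

result_lin = dissolve(0.0, 1.0, 0.5)     # black -> white, timeline midpoint
print(linear_to_srgb(result_lin))        # ~0.5 on the display, as the GUI suggests
```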
23:10 | g3gg0 | left the channel | |
23:10 | troy_s | And likewise, gradients are challenging as hell. If you traverse from red to green, you ideally have to do that on a linear model in order to get correct blending, but if you go from dark red to say bright green, you have to deal with the color traversal (the red to the green) in isolation from the intensity traversal (dark to light)
| |
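And the gradient point in numbers (same sRGB assumption): the midpoint of a red to green ramp, interpolated per channel on encoded values versus on linearized ones:

```python
def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)

encoded_mid = tuple(0.5 * a + 0.5 * b for a, b in zip(red, green))
linear_mid  = tuple(linear_to_srgb(0.5 * srgb_to_linear(a) + 0.5 * srgb_to_linear(b))
                    for a, b in zip(red, green))

print(encoded_mid)  # (0.5, 0.5, 0.0): a dark, muddy olive
print(linear_mid)   # (~0.73, ~0.73, 0.0): the radiometrically correct, brighter midpoint
# Handling chromaticity separately from intensity (dark red -> bright green)
# is the further step this sketch does not attempt.
```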
23:10 | troy_s | Plenty of interesting stuff.
| |
23:10 | lab-bot | Rebelj12a created T123: Add section or page for Open Source Arduino. http://lab.apertus.org/T123
| |
23:13 | __anton__ | heh, took me a while to take it in :) so time-curve is going to be different for each pixel in dissolve?
| |
23:17 | Rebelj12a | joined the channel | |
23:17 | Rebelj12a | *dying* ... ... ... slowly
| |
23:18 | troy_s | Rebelj12a: LOL
| |
23:19 | troy_s | __anton__: Just think of a time line diagonal being perceptual. That is, at 0.5 distance, you expect it to be about 50% perceptual. The way to get a curve back to the linear values is via an inverted curve transform. So in the case of sRGB, 0.5 ends up around 0.2
| |
23:19 | Rebelj12a | Usually the day after a shoot isn't the worst, it's the 2nd day after that kills me. However today... it's really bad for some reason. Might be because my arm actually physically gave out last night. Might have pushed too hard.
| |
23:20 | troy_s | __anton__: Make sense?
| |
23:20 | __anton__ | troy_s: yes, it does. Indeed very interesting :)
| |
23:20 | troy_s | __anton__: So a conforming solution would need to take the spline curves and figure out if they are perceptual or curved, and interpret accordingly.
| |
23:20 | Rebelj12a | Troy is full of fantastically interesting and informative information on color, gamma and curves.
| |
23:21 | troy_s | LOL. I wish.
| |
23:21 | troy_s | Just a few years of making mistakes.
| |
23:21 | troy_s | And doing the old "Good god that looks like ass. What the hell is going on?"
| |
23:22 | troy_s | __anton__: A gradient is an identical dilemma that forces us to examine the two halves of color: Chromaticity (the hue or color of a color) and Luminance (the intensity of the color)
| |
23:23 | troy_s | __anton__: Little factoids come to light that are often overlooked when you are happily painting in Photoshop or something, such as ... The base color red you are looking at _never_ changes in an RGB system. It is the _same_ color red from 0.000000000001 to 1.0.
| |
23:24 | __anton__ | there was a time when I wondered why anybody would need anything apart from RGB triplets at all. Hue? Who needs a hue? :-)
| |
23:24 | troy_s | Rebelj12a: When you logging on it?
| |
23:24 | se6astian | new article released
| |
23:24 | se6astian | https://apertus.org/axiom-beta-progress-article-november-2014
| |
23:24 | se6astian | time for bed now
| |
23:24 | se6astian | good night!
| |
23:24 | se6astian | changed nick to: se6astian|away
| |
23:24 | Rebelj12a | night sebastian
| |
23:26 | Rebelj12a | The online simulation remote controller structure. Would that be for use on a smartphone? Or a proprietary touch screen remote?
| |
23:27 | troy_s | http://assets.pro.sony.eu/Web/menu-simulator/index.html
| |
23:30 | Rebelj12a | No no i mean the end product remote
| |
23:31 | Rebelj12a | Android app, ios app, or proprietary remote/open source remote
| |
23:31 | Rebelj12a | Basically, I want to know what kind of hardware interface screen I'm working with first before making suggestions.
| |
23:32 | troy_s | Rebelj12a: I suspect that is going to be the menu on the side of the camera.
| |
23:32 | troy_s | There was a thing already broken down on the wiki with some elements
| |
23:32 | troy_s | (Like the CC axis: CDL / LUTs per view / eyepiece / etc.)
| |
23:33 | Rebelj12a | ah ok, I'll check the wiki. Yes, I know how the remote system and simulators work, I've played with the Arri
| |
23:33 | Rebelj12a | i.e. pushed shiny buttons
| |
23:34 | Rebelj12a | No no, I meant in the progress article sebastian just posted. There's the Axiom Beta Controller Menu Simulator. Will the Axiom Beta Controller be iOS, Android, or open source / proprietary?
| |
23:34 | Rebelj12a | If so, what hardware?
| |
23:35 | Bertl | most likely it will be PIC32MZ based, with an SPI attached LCD and a bunch of buttons/dials
| |
23:35 | Bertl | think arduino, just microchip instead of atmel
| |
23:36 | Rebelj12a | Ok will the screen support multitouch?
| |
23:36 | Bertl | unlikely
| |
23:36 | Rebelj12a | too bad ok then
| |
23:36 | Bertl | most likely it will not support touch at all, thus the buttons
| |
23:36 | troy_s | Does it need multitouch?
| |
23:37 | troy_s | Thank god.
| |
23:37 | Rebelj12a | oh buttons ok
| |
23:37 | troy_s | Rebelj12a: You ought to be anti-touch too having that blasted BMCC.
| |
23:37 | Rebelj12a | It mentioned remote so
| |
23:37 | Rebelj12a | I am, however, on a remote such as an android app.
| |
23:37 | troy_s | Sure.
| |
23:37 | Rebelj12a | Multitouch is 10x better, especially being that small
| |
23:37 | Rebelj12a | on the camera itself, it's the devil
| |
23:37 | Bertl | yes, the android app will use touch of course :)
| |
23:37 | Rebelj12a | Ah ok ^_^
| |
23:38 | Bertl | but it will, similar to the simulator, make it look like buttons :)
| |
23:38 | Bertl | (at least that's what I understood :)
| |
23:39 | Rebelj12a | So, taking design suggestions on it?
| |
23:40 | troy_s | ala http://assets.pro.sony.eu/Web/menu-simulator/index.html
| |
23:41 | troy_s | and http://www.arri.de/camera/alexa/tools/alexa_camera_simulator.html
| |
23:42 | Rebelj12a | my favorite the alexa
| |
23:42 | Rebelj12a | er Arri
| |
23:44 | Rebelj12a | That wheel mmm
| |
23:48 | Rebelj12a | Bertl, is there a list of planned functions that are going to be in the release OS for Beta? Oh wait, I'll check the wiki
| |
23:49 | Bertl | probably nothing fixed yet, too early
| |
23:50 | Rebelj12a | Ah ok, in designing the menu it's helpful to know the general layout and functions that are needed.
| |
23:52 | Bertl | but do not limit yourself by planned features, we will sooner or later get all features which make sense
| |
23:52 | Bertl | (and a few more which don't :)
| |
23:53 | Rebelj12a | Well yes, menu design and feature design go hand in hand with fixed buttons and the like. Ah, I'll just plan the design for the future, easier that way.
|