23:15 | comradekingu | left the channel | |
00:18 | pizthewiz | left the channel | |
00:20 | Bertl | off to bed now ... have a good one everyone!
| |
00:20 | Bertl | changed nick to: Bertl_zZ
| |
01:56 | pizthewiz | joined the channel | |
02:04 | pizthewiz | left the channel | |
02:15 | pizthewiz | joined the channel | |
02:15 | pizthewiz | left the channel | |
03:32 | intracube | changed nick to: intracube_
| |
07:16 | g3gg0 | joined the channel | |
08:07 | Andrej_ | left the channel | |
08:22 | se6astian|away | changed nick to: se6astian
| |
08:23 | se6astian | good morning
| |
08:28 | lab-bot | sebastian created T112: Standard icons for apertus.org project pages to github and wiki. http://lab.apertus.org/T112
| |
08:58 | lab-bot | sebastian created T113: CMV12000 PCB module progress. http://lab.apertus.org/T113
| |
08:58 | lab-bot | sebastian created T114: Remote Control Simulator draft. http://lab.apertus.org/T114
| |
09:02 | se6astian | updated wishlist table
| |
09:02 | se6astian | https://www.apertus.org/en/dictator
| |
09:03 | se6astian | gotta go
| |
09:03 | se6astian | changed nick to: se6astian|away
| |
12:14 | aombk2 | changed nick to: aombk
| |
12:50 | PhilCee | joined the channel | |
12:51 | Bertl_zZ | changed nick to: Bertl
| |
12:51 | Bertl | morning folks!
| |
12:58 | lab-bot | davidak created T115: Breadcrumb Links don't work. http://lab.apertus.org/T115
| |
13:30 | PhilCee | left the channel | |
15:04 | Juicyfruit | changed nick to: Juicman42
| |
15:04 | Juicman42 | changed nick to: Juiceman42
| |
15:08 | Juiceman42 | changed nick to: Juicyfruit
| |
15:13 | slikdigit | joined the channel | |
15:47 | bassam__ | joined the channel | |
15:47 | slikdigit | left the channel | |
15:47 | bassam__ | changed nick to: slikdigit
| |
15:47 | slikdigit | left the channel | |
15:47 | slikdigit | joined the channel | |
15:52 | slikdigit | left the channel | |
17:30 | intracube_ | changed nick to: intracube
| |
17:34 | se6astian|away | changed nick to: se6astian
| |
17:35 | wescotte | joined the channel | |
17:37 | se6astian | good evening
| |
17:38 | intracube | evening se6astian
| |
17:44 | Bertl | evening!
| |
18:22 | philippej | joined the channel | |
19:00 | Bertl | hey philippej!
| |
19:03 | philippej | Hello Bertl :-)
| |
19:04 | lab-bot | Bertl created T118: Check-in CMOSIS Datasheets to the Beta Repository. http://lab.apertus.org/T118
| |
19:16 | FergusL | left the channel | |
19:17 | FergusL | joined the channel | |
19:26 | lab-bot | sebastian closed T118: Check-in CMOSIS Datasheets to the Beta Repository as "Resolved". http://lab.apertus.org/T118
| |
19:52 | lab-bot | philippej closed T86: can we change the default applications in phabricator left hand menu? as "Wontfix". http://lab.apertus.org/T86
| |
19:52 | philippej | can anyone tell me what "spite" means in the realm of bug / task status ?
| |
19:54 | troy_s | philippej: I suspect it is the sarcastic nature of Phabricator
| |
19:55 | troy_s | philippej: "So, I think features like these actually make the software better, which is why you can close tasks "Out of Spite" in Maniphest and the submit button in Differential is labeled "Clowncopterize". Plus, they're fun to build."
| |
19:55 | philippej | looks for the translation of out of spite
| |
19:56 | philippej | I see
| |
19:56 | troy_s | philippej: I believe that particular one is malicious closure
| |
19:56 | troy_s | As in if you kept bothering me about something and I tried to make my case but you kept at it
| |
19:56 | troy_s | Out of Spite would be a good reason
| |
19:56 | philippej | :-)
| |
19:56 | troy_s | It is close to a wontfix I would think
| |
19:56 | troy_s | But more edged.
| |
19:56 | philippej | we could create some additional status if needed btw
| |
19:57 | troy_s | I think it is hilarious
| |
19:57 | troy_s | It is a Linus Torvalds closure tag really. :)
| |
19:57 | troy_s | Without the f-bombs
| |
19:57 | philippej | I guess it's stuff like that that got me instantly hooked to phabricator
| |
19:57 | troy_s | (Although I could see a closure due to a nasty hard-to-find bug being spiteful too.)
| |
19:57 | troy_s | Yep
| |
19:57 | philippej | this and the fact that it's particularly well written and thought out
| |
19:58 | troy_s | philippej: "Thanks to Phabricator I can close the guy everyone hates' issues out of spite."
| |
19:58 | philippej | on install, it gives you advice on everything, like "the size of your mysql cache is a bit too small", etc ...
| |
19:58 | philippej | haha
| |
19:59 | troy_s | "deliberately hurt, annoy, or offend (someone)."
| |
19:59 | troy_s | That is the definition of spite from Google's top search.
| |
19:59 | troy_s | Hopefully that translates better for you.
| |
20:00 | philippej | perfectly, thanks
| |
20:00 | troy_s | Breadcrumb links not working philippej ?
| |
20:01 | philippej | breadcrumb are tricky in drupal ...
| |
20:01 | Bertl | troy_s: I think they should work, but I haven't seen any labels for that in the source of the pages
| |
20:01 | troy_s | The hash tag one works here.
| |
20:01 | philippej | you have at least 10 add on modules to manage
| |
20:01 | troy_s | The one without doesn't.
| |
20:02 | philippej | if a page is part of the menu the breadcrumb will show
| |
20:02 | troy_s | I just saw the Phab bug
| |
20:02 | philippej | else it won't
| |
20:02 | troy_s | Bertl: Back to the dungeon you!
| |
20:02 | philippej | what would be a decent behavior, that's the question
| |
20:02 | troy_s | J'adore Phabricator.
| |
20:03 | troy_s | This is slick http://files.apertus.org/controller-simulator/
| |
20:04 | philippej | ça rime en plus -> it rhymes :-)
| |
20:04 | troy_s | Sony's is extremely well designed graphically speaking.
| |
20:05 | FergusL | left the channel | |
20:06 | FergusL | joined the channel | |
20:08 | lab-bot | left the channel | |
20:08 | lab-bot | joined the channel | |
20:09 | philippej | lab bot will now be a little bit more verbose, it's an experiment, let me know if the guy talks too much
| |
20:10 | Bertl | lab-bot: what's up pal?
| |
20:11 | philippej | unfortunately it's still as dumb as before
| |
20:11 | philippej | but it will notify some more tasks status changes
| |
20:32 | FergusL | left the channel | |
20:33 | FergusL | joined the channel | |
20:33 | lab-bot | philippej closed T30: apertus.org: email notification for comment publish as "Resolved". http://lab.apertus.org/T30
| |
20:44 | troy_s | On MOX http://major-kong.blogspot.co.uk/2014/10/a-mox-on-both-your-house-formats.html
| |
20:44 | troy_s | (That is by a Foundry developer who worked on Nuke of course)
| |
20:47 | lab-bot | sebastian reopened T30: apertus.org: email notification for comment publish as "Open". http://lab.apertus.org/T30
| |
21:07 | lab-bot | philippej claimed T112: Standard icons for apertus.org project pages to github and wiki. http://lab.apertus.org/T112
| |
21:19 | philippej | About MOX, I see the need for some kind of pdf-x for image sequences (fail- and idiot- proof, reduced feature set from a more general purpose format)
| |
21:19 | philippej | very interesting to see where it ends up
| |
21:20 | troy_s | philippej: I worry that MOX has too much of a myopic view on Adobe products. ICCs etc.
| |
21:20 | troy_s | Really awful idea(TM)
| |
21:21 | troy_s | philippej: Your suggestion is a good point, but at that very point you make an ideological judgment on who / what is your target. To optimize for that will ostracize the other designed targets.
| |
21:21 | troy_s | philippej: It speaks volumes that he got the idea from the After Effects forum.
| |
21:22 | philippej | I'm not taking the pdf/x example because it was created by adobe, but more for its "less is more" philosophy that works extremely well.
| |
21:22 | troy_s | philippej: Whereas Bink's commentary is from the highest end of imaging, and if the high end doesn't adopt it, it's as good as dead.
| |
21:22 | troy_s | Less is never more.
| |
21:22 | troy_s | Less is always less.
| |
21:22 | troy_s | Fsck Mies Van Der Rohe
| |
21:22 | philippej | bring an interactive pdf to the print shop :-)
| |
21:23 | troy_s | (Uniquely Industrial-Modernist. :) )
| |
21:23 | troy_s | PDFs go to the print shop all the time, but not because they are reduced.
| |
21:23 | philippej | then let's say, reducing features sometimes eases communication between parties
| |
21:23 | troy_s | Rather they tend to target the lowest common denominator.
| |
21:23 | troy_s | Sure. And I totally agree.
| |
21:23 | troy_s | The problem is your implied "parties" there.
| |
21:24 | troy_s | Everyone (much like Apertus) is going to have an opinion. The challenging question is who do you listen to?
| |
21:24 | troy_s | (Depends on goals of course, but a minor design decision can have rather massive implications. For example, if Brendan goes with ICC, there is a less than zero percentage chance that any post house anywhere will touch the format. It is a still-born.)
| |
21:25 | troy_s | And if you implement OCIO, you gain them and alienate the Adobe users he is appeasing.
| |
21:25 | troy_s | So... no easy solution there. And certainly _zero_ chance of a solution that works for both without violating your design tenet of reductionism.
| |
21:25 | troy_s | philippej: Sense?
| |
21:25 | philippej | yes
| |
21:26 | philippej | seems that pdf/x worked because it was a consortium of companies
| |
21:26 | troy_s | I don't pretend to have an answer there. Personally, I loathe After Effects.
| |
21:26 | troy_s | PDF 1.4 is everything I've ever had to take to an offset with.
| |
21:26 | troy_s | Not X.
| |
21:26 | philippej | that or because adobe won almost all the printed media tools
| |
21:26 | troy_s | And I'm pretty certain that a vast bulk of print shops probably request 1.4 support.
| |
21:26 | troy_s | Why did they win?
| |
21:27 | troy_s | :)
| |
21:27 | philippej | pdf/x is a subset of 1.4 as far as I understand it
| |
21:27 | troy_s | The answer ties directly into this discussion. They won because they focused on graphic design.
| |
21:27 | troy_s | Hence ICCs etc.
| |
21:27 | philippej | I think partly because at some point, you could reliably bring a pdf file from indesign and be sure it would print correctly
| |
21:27 | troy_s | The issue is that when they branched out into motion picture stuffs, they chose ecosystem compatibility as a design goal and thereby crippled all of photographic imaging henceforth.
| |
21:28 | troy_s | (Photoshop and After Effects are _deeply_ entrenched in graphic design legacy decisions.)
| |
21:28 | troy_s | Sucks entirely for photography work at this juncture, yet not too many can see why.
| |
21:29 | troy_s | I mean it doesn't take a rocket surgeon to see their blend modes and go "Uh... WTF". Linear blend for example, is a sort of a hackey workaround for the fact that their entire subsystem is designed around display referred imaging with non-linear at the core.
| |
21:29 | troy_s | (All legacy reasons.)
| |
21:36 | __anton__ | joined the channel | |
21:37 | __anton__ | troy_s: hi there, reading with great interest :) is there a photo-image handling software that you like (replacing Photoshop?)
| |
21:38 | lab-bot | sebastian created T119: clear boost cache of a particular page when a page has been altered in drupal. http://lab.apertus.org/T119
| |
21:38 | troy_s | __anton__: I'm probably worthy to ignore. But the question is moored in context and to appreciate why I made the statement, you sort of have to unwind all of the implicit crap that folks speak of without stating.
| |
21:38 | philippej | tbh I have no big hope for the adoption of a file format by the industry. Seems like it goes all the time in the other direction (more formats, more obsolescence, more "jails")
| |
21:39 | troy_s | philippej: The "industry" are the folks in most need of interchange. A file format wrapper isn't the most ideal ever, but that takes a little bit of understanding to see why. You simply are never dealing with codecs nor kitchen sink apps.
| |
21:39 | troy_s | __anton__: You are familiar with imaging basics I take it?
| |
21:39 | __anton__ | troy_s: I can probably be ignored here as well, I'm a java programmer at a bank :) Yet I'm interested to hear if there's specifically a photo-image handling piece of software that works with scene-referred intensities
| |
21:40 | philippej | troy_s, do you include camera makers in the industry ?
| |
21:40 | jucar | left the channel | |
21:40 | __anton__ | troy_s: I'm pretty naive, yet I flatter myself thinking I understand what you're talking about
| |
21:41 | troy_s | __anton__: Do you have an image editor handy?
| |
21:41 | __anton__ | GIMP :)
| |
21:41 | troy_s | Perfect.
| |
21:41 | troy_s | Open it.
| |
21:41 | troy_s | Create a canvas. Fill it with a value of neutral grey of say 55 on a 255 8 bit scale.
| |
21:41 | lab-bot | philippej created T120: Provide better multicolumn and layout tools in drupal. http://lab.apertus.org/T120
| |
21:42 | troy_s | __anton__: Let me know when you have done this.
| |
21:43 | __anton__ | troy_s: done
| |
21:43 | troy_s | Make a fuzzy brush.
| |
21:43 | troy_s | A large one.
| |
21:43 | troy_s | Set R and B to full for a magenta
| |
21:43 | troy_s | Do three or four strokes.
| |
21:44 | troy_s | In a row like roman numerals.
| |
21:44 | troy_s | Done?
| |
21:45 | troy_s | (Must be a fuzzy brush of course.)
| |
21:45 | __anton__ | yup
| |
21:45 | troy_s | Now take a full green stroke and cross through all of them. Finally, take a full yellow stroke and also cross through them.
| |
21:45 | troy_s | So you have a picket fence basically
| |
21:45 | troy_s | Purple slats, full green and full yellow intensity cross beams.
| |
21:46 | lab-bot | sebastian created T121: find/build a method to publish individual news on different social media/newsletter/websites. http://lab.apertus.org/T121
| |
21:46 | __anton__ | got a fence :)
| |
21:46 | troy_s | Notice anything odd?
| |
21:46 | troy_s | In particular, around the edges of the blurry strokes?
| |
21:47 | troy_s | (To really make the final case, a full cyan (G + B) will reveal the issue across the purple.)
| |
21:47 | __anton__ | i see that my horizontal sausages got squeezed when they cross the vertical bars
| |
21:47 | troy_s | That isn't squeezing.
| |
21:48 | troy_s | If you look closely, you will see a distinct fringing around the purple.
| |
21:48 | troy_s | Black smokey fringing.
| |
21:48 | troy_s | Yes?
| |
21:48 | __anton__ | purple eats into yellow and green?
| |
21:49 | troy_s | You should see it with the saturated magenta crossing the grey, the green crossing the magenta, and the cyan.
| |
21:49 | troy_s | It doesn't eat.
| |
21:49 | troy_s | That's the result of a non-linear blend.
| |
21:49 | troy_s | Basically, when we ship colors to a display (LDR) we have to bend the intensities so they look like what we expect in our HDR reality.
| |
21:50 | troy_s | That bending results in nasty ass bad math because the math being done is _not_ based on any sort of real-world physics, which we define as our baseline of "correct".
| |
21:50 | troy_s | So that fringing is because you are painting in a display referred model using non-linear color models.
| |
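A minimal numeric sketch of the fringing troy_s describes, assuming an sRGB-style transfer curve (GIMP's exact brush math may differ): a 50/50 mix of saturated magenta and green computed directly on the encoded values lands at 0.5 per channel, visibly darker than the roughly 0.73 you get when the same mix is done on linearized values and re-encoded, which is the smoky fringe around the strokes.

    # Illustration: naive blend on sRGB-encoded values vs. a radiometrically
    # linear blend (assumes the standard sRGB piecewise transfer curve).
    def srgb_to_linear(v):
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    magenta = (1.0, 0.0, 1.0)   # display-referred (encoded) pixel values
    green   = (0.0, 1.0, 0.0)

    # Blend directly on the encoded values (what a display-referred editor does):
    naive = [0.5 * a + 0.5 * b for a, b in zip(magenta, green)]

    # Linearize, blend, re-encode:
    correct = [linear_to_srgb(0.5 * srgb_to_linear(a) + 0.5 * srgb_to_linear(b))
               for a, b in zip(magenta, green)]

    print(naive)    # [0.5, 0.5, 0.5]   -> the dark "smoky" fringe
    print(correct)  # ~[0.735, 0.735, 0.735]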
21:50 | __anton__ | yes smth about gamma I guess
| |
21:50 | __anton__ | so is there a tool for handling photos that uses linear?
| |
21:50 | troy_s | Indeed. Gamma is a uniform curve. A better term is transfer curve or tone response curve, because not all non-linear curves are uniform. Some are a couple of formulas, some are more.
| |
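To make the "not all non-linear curves are uniform" point concrete, a small sketch comparing a pure 2.2 gamma with the standard two-part sRGB encoding (a linear toe plus a power segment); the two roughly agree around middle grey but diverge in the deep shadows.

    # Pure gamma vs. the two-part sRGB transfer curve (standard formulas).
    def pure_gamma_encode(v, gamma=2.2):
        return v ** (1.0 / gamma)

    def srgb_encode(v):
        return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    for lin in (0.001, 0.01, 0.18, 0.5, 1.0):
        print(lin, round(pure_gamma_encode(lin), 4), round(srgb_encode(lin), 4))
    # 0.18 (middle grey) encodes to ~0.46 with either curve; the shadows differ a lot.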
21:51 | troy_s | Well the next bit of logic is
| |
21:51 | troy_s | "Ok I get it... we need linear (or to be more precise, radiometrically linear ratios) of data values to imitate the real world."
| |
21:51 | troy_s | Agree with that sort of basic explanation?
| |
21:51 | __anton__ | yes
| |
21:51 | troy_s | Now take your DSLR
| |
21:51 | troy_s | What kind do you have?
| |
21:51 | __anton__ | Nikon D40, Panasonic GH3
| |
21:51 | troy_s | Great.
| |
21:52 | troy_s | When you set it to JPEG, it mangles up the data it captures and spits out a display referred image from zero to one (or more specifically, 0-255 for an 8 bit image)
| |
21:52 | troy_s | It is _curved_
| |
21:52 | troy_s | and the ranges are fixed from minimum to maximum.
| |
21:52 | troy_s | But as you know, your D40, even though it isn't the uber super camera you could buy if you wanted to
| |
21:52 | troy_s | has... pretty damn good latitude. Agree?
| |
21:53 | troy_s | It can probably capture a decent six or seven stops above a middle grey card shot.
| |
21:53 | troy_s | Agree?
| |
21:53 | __anton__ | Yes, sounds about right
| |
21:53 | troy_s | In data now... If we expose such that a middle grey 18% reflectance card comes up at around say, the middle of a JPEG's value set.
| |
21:54 | troy_s | It would be 120-124 ish
| |
21:54 | troy_s | If we _linearize_ it by flipping the sRGB curve around
| |
21:54 | troy_s | That value ends up at 0.2
| |
21:54 | troy_s | Or so.
| |
21:54 | troy_s | With me so far?
| |
21:54 | troy_s | Or did I lose you in the rabbit hole?
| |
21:54 | __anton__ | yeah, I started looking into this recently when you spoke of OpenEXR, I think I get it
| |
21:54 | troy_s | Good.
| |
21:54 | troy_s | Case problem 1:
| |
21:54 | troy_s | Our JPEG can hold how many stops of data?
| |
21:55 | troy_s | 0.2 + 0.2 = 0.4
| |
21:55 | troy_s | 1 stop
| |
21:55 | lab-bot | philippej updated subscribers of T121: find/build a method to publish individual news on different social media/newsletter/websites. http://lab.apertus.org/T121
| |
21:55 | troy_s | 0.4+0.4 = 0.8 two stops
| |
21:55 | troy_s | Only two and a bit stops!
| |
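Troy_s's arithmetic redone as a tiny script, assuming middle grey lands at 0.2 after linearizing the JPEG: doubling from 0.2 runs out of headroom after a little more than two stops, because the display-referred encoding clips at 1.0.

    # How many stops above a 0.2 middle grey fit in a 0..1 linearized JPEG?
    import math
    middle_grey = 0.2
    stops_of_headroom = math.log2(1.0 / middle_grey)
    print(stops_of_headroom)        # ~2.32 stops before the values clip at 1.0

    value, stop = middle_grey, 0
    while value * 2 <= 1.0:
        value, stop = value * 2, stop + 1
        print(stop, value)          # 1 -> 0.4, 2 -> 0.8, and that's it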
21:55 | troy_s | The camera has to take that massive dynamic range and bend it _really_ quite badly to cram it into a display referred jpeg.
| |
21:55 | troy_s | And we can't really get that data back because we only have the jpeg linearization to invert.
| |
21:55 | troy_s | If we take a raw data file
| |
21:56 | philippej | troy_s it would be really great to have articles on what you often explain here
| |
21:56 | troy_s | the values have a maximum and a minimum still, but we could, if we wanted, take the values and (because sensors are damn close to linear) dump them into an imaging editor.
| |
21:56 | troy_s | If we load them in GIMP, we are still stuck in display referred land - 0 to 1.
| |
21:57 | philippej | (I mean if someone could summarize this :-) )
| |
21:57 | troy_s | But as we saw in the GIMP demo, we _must_ have linearized values in order to do an over correctly.
| |
21:57 | troy_s | So if your Nikon D40 has say, six stops above middle grey
| |
21:57 | troy_s | and we map middle grey to 0.2
| |
21:57 | lab-bot | left the channel | |
21:57 | lab-bot | joined the channel | |
21:57 | troy_s | .2, .4, .8, 1.6, 3.2, 6.4, 12.8
| |
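Carrying the same doubling out to the six stops troy_s attributes to the D40 (illustrative numbers only): most of the scene-referred values land above 1.0, which is why they need an open-ended floating-point container such as OpenEXR rather than a fixed 0..1 display-referred file.

    # Scene-referred values for up to six stops above a 0.2 middle grey (illustrative).
    middle_grey = 0.2
    values = [middle_grey * 2 ** stop for stop in range(0, 7)]
    print(values)   # 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8 -- most are above 1.0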
21:58 | philippej | good night everyone !
| |
21:58 | troy_s | Enter scene referred imaging.
| |
21:58 | troy_s | __anton__: Does that make the issue any clearer?
| |
21:58 | troy_s | In such an application you'd need to specifically control several things:
| |
21:58 | __anton__ | Thanks, Troy, that's a very accessible explanation
| |
21:58 | troy_s | 1) What is the curve of the input image? There are an infinite number. The app needs to let you set the inversion correctly.
| |
21:59 | philippej | left the channel | |
22:00 | troy_s | 2) Such an app would need to have a mapping for white and black points. Arbitrarily high values and lower ones to the display referred domain. That is, we have to creatively control how we transform from the scene referred domain to the display referred.
| |
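A rough sketch of those two controls under deliberately simplified assumptions (a single pure-power input curve and a bare scale-and-clip to the display; real transforms are considerably more involved): first invert the input transfer curve to recover scene-referred ratios, then make a creative choice about where scene-referred white lands in the 0..1 display domain.

    # Hypothetical, simplified skeleton of a scene-referred pipeline.
    def decode_input(encoded, gamma=2.2):
        # 1) Invert the (assumed) input transfer curve -> scene-referred ratios.
        return [v ** gamma for v in encoded]

    def to_display(scene, scene_white=8.0, gamma=2.2):
        # 2) Map a chosen scene-referred white down to display 1.0, then re-encode.
        return [min(v / scene_white, 1.0) ** (1.0 / gamma) for v in scene]

    scene = decode_input([0.1, 0.46, 0.9])
    print(to_display(scene))   # creative choice: everything above scene_white clips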
22:00 | troy_s | But all of that crap I just sort of rolled through extremely swiftly, takes a bit of reasoning as to _why_ you need to control those things.
| |
22:00 | __anton__ | that's still great to read it
| |
22:00 | troy_s | And that isn't easily accessible to everyone, worse if you are only used to Photoshop or some other graphic-design centric application.
| |
22:01 | __anton__ | and it matches well with what you spoke about earlier
| |
22:01 | troy_s | Have you ever used an NLE app and done a dissolve?
| |
22:01 | troy_s | Probably right?
| |
22:01 | troy_s | Now ask yourself how those values were blended.
| |
22:01 | troy_s | Chances are they used the curved, non-linear values in the image carrier format, and guess what...
| |
22:01 | troy_s | that dissolve is as broken as your magenta over grey
| |
22:01 | __anton__ | :)
| |
22:02 | troy_s | Hence the idea of a video dissolve versus say, a film dissolve (or more accurately, a nonlinear blend versus a more correct radiometrically linear blend.)
| |
22:02 | troy_s | Interesting stuff if you are into imaging for certain. Not so much if you aren't.
| |
22:02 | troy_s | :)
| |
22:02 | __anton__ | I understand there are decent tools out there that work with video using scene referred imaging
| |
22:02 | __anton__ | What about photo?
| |
22:02 | troy_s | Yep.
| |
22:03 | troy_s | Very very very very very few to the best of my knowledge.
| |
22:03 | troy_s | Because A) most folks are used to things like Photoshop, and when you say something like 'scene referred' their head spins.
| |
22:03 | troy_s | B) the above "simple" explanation takes quite a bit of grounding foundation.
| |
22:03 | troy_s | C) Many folks are used to Aperture or Lightroom
| |
22:03 | troy_s | Which are specifically, processing tools.
| |
22:04 | troy_s | When processing, the model is more than often smart to use a display referred form.
| |
22:04 | troy_s | Which opens up a huge can of worms
| |
22:04 | troy_s | As in ... can we mix and match our models of scene versus display?
| |
22:04 | troy_s | What happens if we do?
| |
22:04 | troy_s | Or do we divide the manipulations further into two classes:
| |
22:04 | troy_s | A) Image manipulation / blending / over / etc.
| |
22:04 | troy_s | B) Color Manipulation / grading.
| |
22:05 | troy_s | and suddenly it begins to make sense as to why you never grade mid pipeline, but rather only at the very very end.
| |
22:05 | Bertl | even blending or resizing doesn't give correct results if the space is nonlinear (unless the blending math corrects for it)
| |
22:06 | troy_s | Bingo.
| |
22:06 | troy_s | Nor blurs.
| |
22:06 | troy_s | Anything with interpolation.
| |
22:06 | troy_s | Because the interpolation requires a _traversal_ across a tri-color model.
| |
22:06 | troy_s | And anything that requires traversal is utterly going to be broken in a non-linear format.
| |
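The same problem shows up in something as simple as averaging two neighbouring pixels for a resize or blur; a minimal sketch assuming the sRGB curve: the encoded-space average of a dark and a bright pixel comes out noticeably darker than the radiometrically correct result.

    # Downscale two pixels to one: average in encoded space vs. linear space.
    def srgb_to_linear(v):
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    dark, bright = 0.1, 0.9            # encoded (display-referred) values
    encoded_avg = (dark + bright) / 2  # 0.5 -- what a naive resize/blur does
    linear_avg = linear_to_srgb((srgb_to_linear(dark) + srgb_to_linear(bright)) / 2)
    print(encoded_avg, round(linear_avg, 3))   # 0.5 vs ~0.66 -- the naive result is too dark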
22:06 | troy_s | __anton__: Kind of interesting stuff eh?
| |
22:06 | troy_s | __anton__: Here's a cool question...
| |
22:07 | troy_s | Think of a dissolve briefly.
| |
22:07 | troy_s | Now visualize making a dissolve that runs a diagonal line from top to bottom ... a dissolve down to another strip.
| |
22:07 | troy_s | Being a LINEAR line, we can immediately say "Wait a minute... that means it is perceptually linear, because at 50% of the line we expect the display referred result to be at about half perceptual intensity"
| |
22:08 | troy_s | So how the hell do you do a dissolve using that sort of GUI element?
| |
22:08 | troy_s | If you _linearize_ the data, 50% along the line will have a dissolve that isn't perceptual at all, but rather quite a bit brighter at 50%.
| |
22:08 | lab-bot | Rebelj12a created T122: Reorganization. http://lab.apertus.org/T122
| |
22:08 | troy_s | Yet we _must_ have linearized data to do the dissolve 'correctly'.
| |
22:08 | troy_s | :)
| |
22:09 | troy_s | (Hint: You have to effectively bend the time curve)
| |
22:09 | troy_s | If you have a linear diagonal time curve, you have to bend that in accordance with your non-linear output.
| |
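One possible reading of "bend the time curve", sketched under the assumption that the GUI's diagonal timeline is meant to track perceived progress: push the timeline fraction through the display encoding before using it as the mix weight on linearized pixels, so that 50% along the line still reads as roughly half-way (for sRGB, 0.5 maps to about 0.21, matching the 0.2 troy_s quotes).

    # A black-to-white dissolve, with and without bending the time curve.
    def srgb_to_linear(v):
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    a_lin, b_lin = 0.0, 1.0          # dissolve from black to white, linear light
    t = 0.5                          # 50% along the GUI's timeline

    naive = linear_to_srgb((1 - t) * a_lin + t * b_lin)           # mix with raw t
    bent_t = srgb_to_linear(t)                                    # ~0.21: bend the time curve
    bent = linear_to_srgb((1 - bent_t) * a_lin + bent_t * b_lin)

    print(round(naive, 3), round(bent, 3))   # ~0.735 vs ~0.5 -- the bent version reads as half-way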
22:10 | g3gg0 | left the channel | |
22:10 | troy_s | And likewise, gradients are challenging as hell. If you traverse from red to green, you ideally have to do that on a linear model in order to get correct blending, but if you go from dark red to say bright green, you have to deal with the color traversal (the red to the green) in isolation from the intensity traversal (dark to light)
| |
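A rough sketch of the dark-red-to-bright-green case, with the caveat that this is only one of several ways to split the problem: interpolate the chromaticity (luminance-normalized RGB, Rec.709 weights assumed) linearly, interpolate the intensity separately in stops, and recombine, rather than lerping the raw triplets.

    # Hypothetical gradient sketch: keep chromaticity and intensity traversal separate.
    import math
    LUMA = (0.2126, 0.7152, 0.0722)   # Rec.709 luminance weights

    def luminance(rgb):
        return sum(w * c for w, c in zip(LUMA, rgb))

    def gradient_point(a, b, t):
        la, lb = luminance(a), luminance(b)
        # chromaticity: RGB scaled to unit luminance, lerped linearly
        chroma = [(1 - t) * (ca / la) + t * (cb / lb) for ca, cb in zip(a, b)]
        # intensity: interpolated in stops (geometrically), not linearly
        lum = math.exp((1 - t) * math.log(la) + t * math.log(lb))
        return [c * lum for c in chroma]

    dark_red     = (0.05, 0.0, 0.0)   # linear-light values
    bright_green = (0.0, 0.8, 0.0)
    print(gradient_point(dark_red, bright_green, 0.5))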
22:10 | troy_s | Plenty of interesting stuff.
| |
22:10 | lab-bot | Rebelj12a created T123: Add section or page for Open Source Arduino. http://lab.apertus.org/T123
| |
22:13 | __anton__ | heh, took me a while to take it in :) so time-curve is going to be different for each pixel in dissolve?
| |
22:17 | Rebelj12a | joined the channel | |
22:17 | Rebelj12a | *dying* ... ... ... slowly
| |
22:18 | troy_s | Rebelj12a: LOL
| |
22:19 | troy_s | __anton__: Just think of a time line diagonal being perceptual. That is, at 0.5 distance, you expect it to be about 50% perceptual. The way to get a curve back to the linear values is via an inverted curve transform. So in the case of sRGB, 0.5 ends up around 0.2
| |
22:19 | Rebelj12a | Usually the day after a shoot isn't the worst, it's the 2nd day after that kills me. However today... it's really bad for some reason. Might be because my arm actually physically gave out last night. Might have pushed too hard.
| |
22:20 | troy_s | __anton__: Make sense?
| |
22:20 | __anton__ | troy_s: yes, it does. Indeed very interesting :)
| |
22:20 | troy_s | __anton__: So a conforming solution would need to take the spline curves and figure out if they are perceptual or curved, and interpret accordingly.
| |
22:20 | Rebelj12a | Troy is full of fantastically interesting and informative information on color, gamma and curves.
| |
22:21 | troy_s | LOL. I wish.
| |
22:21 | troy_s | Just a few years of making mistakes.
| |
22:21 | troy_s | And doing the old "Good god that looks like ass. What the hell is going on?"
| |
22:22 | troy_s | __anton__: A gradient is an identical dilemma that forces us to examine the two halves of color: Chromaticity (the hue or color of a color) and Luminance (the intensity of the color)
| |
22:23 | troy_s | __anton__: Little factoids come to light that are often overlooked when you are happily painting in Photoshop or something, such as ... The base color red you are looking at _never_ changes in an RGB system. It is the _same_ color red from 0.000000000001 to 1.0.
| |
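A quick way to see the "same red at every intensity" point: normalize any RGB triplet by its channel sum and the resulting chromaticity coordinates stay put no matter how far the intensity is scaled up or down.

    # The chromaticity of "red" is identical at any intensity in an RGB system.
    def rg_chromaticity(r, g, b):
        s = r + g + b
        return (round(r / s, 4), round(g / s, 4))

    base_red = (1.0, 0.1, 0.1)
    for scale in (1e-9, 0.001, 0.25, 1.0):
        print(rg_chromaticity(*(scale * c for c in base_red)))
    # (0.8333, 0.0833) every time -- only the intensity changes, never the hue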
22:24 | __anton__ | there was a time when I wondered why anybody would need anything apart from RGB triplets at all. Hue? Who needs a hue? :-)
| |
22:24 | troy_s | Rebelj12a: When you logging on it?
| |
22:24 | se6astian | new article released
| |
22:24 | se6astian | https://apertus.org/axiom-beta-progress-article-november-2014
| |
22:24 | se6astian | time for bed now
| |
22:24 | se6astian | good night!
| |
22:24 | se6astian | changed nick to: se6astian|away
| |
22:24 | Rebelj12a | night sebastian
| |
22:26 | Rebelj12a | The online simulation remote controller structure. Would that be for use on a smartphone? Or proprietary touch screen remote?
| |
22:27 | troy_s | http://assets.pro.sony.eu/Web/menu-simulator/index.html
| |
22:30 | Rebelj12a | No no i mean the end product remote
| |
22:31 | Rebelj12a | Android app, ios app, or proprietary remote/open source remote
| |
22:31 | Rebelj12a | Basically, I want to know what kind of hardware interface screen I'm working with first before making suggestions.
| |
22:32 | troy_s | Rebelj12a: I suspect that is going to be the menu on the side of the camera.
| |
22:32 | troy_s | There was a thing already broken down on the wiki with some elements
| |
22:32 | troy_s | (Like the CC axis (CDL / LUTs per view / eyepiece /etc.)
| |
22:33 | Rebelj12a | ah ok, ill check the wiki. Yes i know how the remote system and simulators work, ive played with the arri
| |
22:33 | Rebelj12a | i.e. pushed shiny buttons
| |
22:34 | Rebelj12a | No no i meant in the progress article sebastian just posted. There's the Axiom Beta Controller Menu Simulator. The Axiom Beta Controller, will it be ios, android, or open source / proprietary?
| |
22:34 | Rebelj12a | If so what hardware.
| |
22:35 | Bertl | most likely it will be PIC32MZ based, with an SPI attached LCD and a bunch of buttons/dials
| |
22:35 | Bertl | think arduino, just microchip instead of atmel
| |
22:36 | Rebelj12a | Ok will the screen support multitouch?
| |
22:36 | Bertl | unlikely
| |
22:36 | Rebelj12a | too bad ok then
| |
22:36 | Bertl | most likely it will not support touch at all, thus the buttons
| |
22:36 | troy_s | Does it need multitouch?
| |
22:37 | troy_s | Thank god.
| |
22:37 | Rebelj12a | oh buttons ok
| |
22:37 | troy_s | Rebelj12a: You ought to be anti-touch too having that blasted BMCC.
| |
22:37 | Rebelj12a | It mentioned remote so
| |
22:37 | Rebelj12a | I am, however, on a remote such as an android app.
| |
22:37 | troy_s | Sure.
| |
22:37 | Rebelj12a | Multitouch is 10x better. especially being that small
| |
22:37 | Rebelj12a | on the camera itself, its the devil
| |
22:37 | Bertl | yes, the android app will use touch of course :)
| |
22:37 | Rebelj12a | Ah ok ^_^
| |
22:38 | Bertl | but it will, similar to the simulator, make it look like buttons :)
| |
22:38 | Bertl | (at least that's what I understood :)
| |
22:39 | Rebelj12a | So, taking design suggestions on it?
| |
22:40 | troy_s | ala http://assets.pro.sony.eu/Web/menu-simulator/index.html
| |
22:41 | troy_s | and http://www.arri.de/camera/alexa/tools/alexa_camera_simulator.html
| |
22:42 | Rebelj12a | my favorite the alexa
| |
22:42 | Rebelj12a | er Arri
| |
22:44 | Rebelj12a | That wheel mmm
| |
22:48 | Rebelj12a | Bertl is there a list of perceived functions that are going to be in the release OS for Beta? Oh wait I'll check the wiki
| |
22:49 | Bertl | probably nothing fixed yet, too early
| |
22:50 | Rebelj12a | Ah ok, in designing the menu it's helpful to know the general layout and functions that are needed.
| |
22:52 | Bertl | but do not limit yourself by planned features, we will sooner or later get all features which make sense
| |
22:52 | Bertl | (and a few more which don't :)
| |
22:53 | Rebelj12a | Well yes, menu design and feature design go hand in hand with fixed buttons and the like. Ah ill just plan design for future easier that way.
|