23:04 | Sasha_C | joined the channel | |
23:17 | Bertl | welcome Sasha_C!
| |
23:17 | Sasha_C | Good morning
| |
23:17 | Sasha_C | (or evening, if you're in Austria?)
| |
23:18 | Bertl | doesn't matter :) .. how's life?
| |
23:18 | Sasha_C | Good, it's a beautiful Saturday morning here in Sydney
| |
23:19 | Sasha_C | I finally have some time to create a draft blog post on the apertus site. And after I've done that, I think I might head out to a local street festival (only happening this weekend).
| |
23:19 | Sasha_C | How are you?
| |
23:20 | Bertl | nice! I'm fine ... got some time to work on axiom alpha now
| |
23:20 | Bertl | so I'm coding and debugging :)
| |
23:20 | Sasha_C | How are things coming along with alpha development?
| |
23:20 | Sasha_C | Yes, I see
| |
23:20 | Bertl | quite nicely actually ...
| |
23:21 | Sasha_C | Always wonderful to hear that :)
| |
23:21 | Bertl | some minor issues popped up, but nothing we cannot work around or solve
| |
23:22 | Sasha_C | So, do you have an actual copy of the cmosis cmv12000 now attached to the SFE hardware and zedboard?
| |
23:22 | Bertl | copy?
| |
23:23 | Bertl | I've a cmosis sensor here sitting on my desk, plugged into the sensor board, which in turn is attached to the zedboard I'm working on :)
| |
23:23 | Sasha_C | Sorry, not copy, what I meant was an actual sensor rather than a dummy model sensor
| |
23:23 | Sasha_C | Wow! Sounds very cool!
| |
23:23 | Bertl | and I'm chatting with the sensor :)
| |
23:23 | Sasha_C | Great work man!!
| |
23:23 | Bertl | thanks! appreciated!
| |
23:24 | Sasha_C | And is the lens mount also attached? Are you saying the hardware is ready to go and capture footage with once the software side of things has been taken care of?
| |
23:25 | Bertl | no, lens mount and lens is currently with sebastian
| |
23:25 | Sasha_C | But I imagine, when the time is ready, it won't be difficult to attach the lens mount to the SFE?
| |
23:25 | Bertl | and the hardware isn't ready to capture data yet, but I'm receiving data similar to what we will be able to capture soon
| |
23:26 | Bertl | i.e. working on that :)
| |
23:26 | Sasha_C | Sounds sooo cool!
| |
23:26 | Bertl | it's actually quite complicated to get that data from the sensor, which probably isn't that obvious to the public
| |
23:27 | Bertl | there are 70 wires involved, working in pairs, they send signals at 300MHz in both directions
| |
23:28 | Bertl | those signals carry the clock and data to and from the sensor
| |
23:28 | Sasha_C | Yeah, very complicated
| |
23:28 | Bertl | the data itself is broken down into serial streams of 8, 10 or 12 bits
| |
23:28 | Bertl | and for example, you have to tune the delay for each line separately
| |
23:29 | Bertl | to get meaningful data, then you have to align the bitstream on word boundary
| |
23:29 | Bertl | again for each channel separately
| |
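The per-channel word alignment Bertl describes can be sketched in a few lines of Python. This is an illustration only: the 12-bit training word below is made up, while the real pattern is configured via a sensor register.

```python
def find_word_offset(bits, training_word, word_bits=12):
    """Scan the possible bit offsets and return the one at which every
    complete word in the stream matches the known training pattern."""
    for offset in range(word_bits):
        words = [
            int(bits[i:i + word_bits], 2)
            for i in range(offset, len(bits) - word_bits + 1, word_bits)
        ]
        if words and all(w == training_word for w in words):
            return offset
    return None  # no alignment found; the per-line delay tuning likely failed

# Hypothetical 12-bit training word (not the real CMV12000 value):
TRAIN = 0b101010101010
stream = "011" + format(TRAIN, "012b") * 4   # stream misaligned by 3 bits
print(find_word_offset(stream, TRAIN))       # -> 3
```

In the real FPGA this happens per channel in hardware, but the search idea is the same: shift until the known pattern drops out cleanly.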
23:29 | Sasha_C | Do you think it would be beneficial for me to write about this on the apertus site, explaining the difficulty of getting data from the sensor?
| |
23:29 | Sasha_C | So people know what we (or you) are currently working on?
| |
23:30 | Bertl | well, all this is not really unexpected, i.e. it is how LVDS works, but yes, I guess for the typical movie folks following the axiom development it might be quite interesting
| |
23:31 | Bertl | it's like using a microscope to look at your fingertip :)
| |
23:31 | Sasha_C | interesting analogy
| |
23:31 | Sasha_C | I'll write a draft and send it to the team for review :D
| |
23:31 | Bertl | or if you want the computer analogy:
| |
23:31 | Sasha_C | yess?
| |
23:32 | Bertl | consider how transistors (switches) form gates, and millions of gates form a processor which then executes assembler code which was generated by a compiler processing a C program
| |
23:34 | Sasha_C | Amazing how all this fits onto a small chip/board!
| |
23:34 | Bertl | yes, and we are working at all levels here
| |
23:35 | Bertl | down to the smallest transistor and gate and up to the highest level where we execute shell scripts on the arm cores of the zedboard
| |
23:35 | Bertl | everything needs to work together in perfect harmony
| |
23:36 | Sasha_C | How long do you estimate this could take to work out?
| |
23:36 | Bertl | if everything goes as expected, we should have some 'images' next week
| |
23:37 | Sasha_C | That is FANTASTIC news man!!!
| |
23:37 | Sasha_C | Well done!
| |
23:37 | Bertl | I'll probably get the first real image data this weekend, but it won't be that useful from the photographic point of view
| |
23:37 | Sasha_C | Of course, not until the lens mount is attached
| |
23:38 | Bertl | precisely
| |
23:39 | Sasha_C | Can you describe what all of the hardware looks like at the moment? I'm imagining that there are two separate small-to-medium sized boards (about the size of an A4 piece of paper) connected by wires to a laptop, external HDD/SSD and power source.
| |
23:39 | Bertl | you haven't seen the pictures yet?
| |
23:40 | Sasha_C | I've seen pictures of the circuit boards, from OSH Park, but not of everything connected together
| |
23:40 | Bertl | well, I guess you saw the ones on the axiom site, yes?
| |
23:40 | Sasha_C | yeah
| |
23:40 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/axiom_alpha_6.png
| |
23:40 | Bertl | so this one plugs into the zedboard
| |
23:41 | Bertl | on the right side, directly into the FMC connector
| |
23:41 | Bertl | on the left side, there are two of those pmod_debug modules attached
| |
23:41 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/pmod_assembled_top_600dpi.png
| |
23:41 | Sasha_C | Ah yes, I've seen this. That is amazing! It looks like it has the potential for such a small form-factor
| |
23:42 | Bertl | at the bottom, we have a cable to connect the PMOD on the frontend
| |
23:42 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/axiom_alpha_3.png
| |
23:42 | Sasha_C | This is SOOO cool! Thanks for sharing this with me
| |
23:43 | Bertl | this is custom made - it connects the PMOD at the bottom left with the frontend right across the zedboard
| |
23:43 | Bertl | I've also a rainbow colored cable attached at the second PMOD connector at the bottom, which feeds debug data into my analyzer
| |
23:44 | Sasha_C | Incredible Herbert! You're amazing!
| |
23:44 | Bertl | then the usual cables attached to the zedboard, like power, ethernet, dual micro usb, hdmi
| |
23:44 | Bertl | the zedboard boots via ethernet, so that's important
| |
23:46 | Bertl | on the PC I have quite a number of PDFs open (mostly datasheets and documentation)
| |
23:47 | Bertl | also a number of files describing the pin assignments, the actual VHDL code for the FPGA and a dozen tools to allow to capture and interpret data
| |
23:50 | Sasha_C | This is all so amazing to hear! I'll start writing up an article referring to this now
| |
23:50 | Bertl | hehe :)
| |
23:50 | Bertl | well, let me know if you need more detail on anything in this regard
| |
23:51 | Sasha_C | Will do :)
| |
04:06 | Bertl | night everyone!
| |
04:06 | Sasha_C | night man. sleep well
| |
04:12 | Sasha_C | left the channel | |
05:20 | Sasha_C | joined the channel | |
11:12 | Bertl | morning everyone!
| |
14:56 | Sasha_C | left the channel | |
16:45 | se6astian | joined the channel | |
16:46 | se6astian | evening!
| |
17:51 | Bertl | hey se6astian! did you see the gerber url yesterday?
| |
18:06 | se6astian | nope sorry
| |
18:06 | se6astian | let me dig them up
| |
18:10 | se6astian | got it
| |
18:10 | se6astian | http://vserver.13thfloor.at/Stuff/AXIOM/cmv12k-adapter-v1.1.tar.xz
| |
18:10 | se6astian | thanks, will forward that to sierracircuits
| |
18:12 | se6astian | done
| |
18:25 | se6astian | and sent datasheet links to akhilesch
| |
18:57 | gcolburn | joined the channel | |
18:57 | gcolburn | hello
| |
19:32 | se6astian | hey there
| |
19:32 | se6astian | long time no see
| |
19:32 | se6astian | how are you gabe
| |
19:34 | se6astian | ok, I have to head out for a bit
| |
19:34 | se6astian | please check with Bertl on the latest development news ;)
| |
19:34 | se6astian | see you later maybe
| |
19:37 | gcolburn | sorry
| |
19:38 | gcolburn | must have missed you; yeah, I was hoping to catch up on what I could help out with
| |
20:10 | Bertl | hey gcolburn!
| |
20:14 | gcolburn | hey!
| |
20:14 | gcolburn | how's it going?
| |
20:14 | Bertl | fine, thanks, just finished working on some wood stuff
| |
20:16 | Bertl | so you're eager to start with axiom/alpha development/testing, yes?
| |
20:16 | gcolburn | nice. what do you make?
| |
20:17 | Bertl | it was a simple board (kind of cupboard) which gets attached to the wall to carry a speaker
| |
20:17 | gcolburn | neat
| |
20:17 | Bertl | helped a friend there
| |
20:18 | gcolburn | eventually when I get around to designing the physical part of the digital camera back I'd like to make I've considered making it out of a nice wood (since many large format cameras are made out of wood), and all the electronics would be enclosed inside
| |
20:18 | Bertl | well, we have a wooden handle planned :)
| |
20:18 | gcolburn | yeah, that's cool
| |
20:19 | Bertl | the thing with wood and electronics is that they do not go that well
| |
20:19 | Bertl | first, wood kind of attracts humidity, if not treated
| |
20:19 | gcolburn | yeah, I was thinking there would have to be something sealed inside so the electronics aren't in contact with it. It was just an idea; not sure which route I'd go
| |
20:19 | Bertl | second, it is really poor at shielding, compared to most metal solutions
| |
20:20 | Bertl | (both electrical and magnetically)
| |
20:20 | gcolburn | would a camera need much shielding though?
| |
20:20 | Bertl | yes, a lot
| |
20:21 | gcolburn | interesting. are you mainly concerned with noise being introduced to the imager?
| |
20:21 | Bertl | it's all high frequency (300MHz and more) and a lot of data which needs to be transported without any distortion
| |
20:21 | Bertl | so basically the camera is a radio transmitter and receiver
| |
20:22 | Bertl | an involuntary one that is :)
| |
20:22 | gcolburn | now that you put it that way it makes sense (based on the frequency)
| |
20:22 | gcolburn | so what's the status of the prototype?
| |
20:22 | Bertl | it is sitting right beside me, powered up, and I'm talking with the sensor
| |
20:23 | Bertl | receiving test patterns and implementing the training stuff to get reliable LVDS transmissions
| |
20:23 | gcolburn | so mainly using the SPI at the moment?
| |
20:24 | Bertl | spi for control settings and retrieving status information
| |
20:24 | Bertl | 300MHz 12 bit serial over LVDS for the data
| |
20:25 | se6astian | back
| |
20:25 | gcolburn | ok, so these are the test patterns generated by the sensor, not test patterns for the HDMI?
| |
20:25 | Bertl | correct, the LVDS training patterns
| |
20:25 | gcolburn | ok cool
| |
20:25 | gcolburn | I read about that in the datasheet
| |
20:27 | Bertl | so what is your current equipment regarding axiom alpha and what would you like to work on?
| |
20:28 | gcolburn | well, I currently mainly have the ZedBoard
| |
20:28 | gcolburn | at some point whenever you think it would make sense, I'd be interested in purchasing a board and sensor
| |
20:28 | Bertl | okay, makes sense, we will figure out a way to get boards and sensors to developers
| |
20:29 | gcolburn | I don't know what work needs to be done at this point, if its more writing C code/scripts on the linux side or FPGA code
| |
20:29 | Bertl | have you set up a development environment (on Linux?) for your ZedBoard?
| |
20:30 | gcolburn | are you referring to just the normal Xilinx tools (PlanAhead, ISE, SDK, etc.)? I've been using that in Ubuntu Linux.
| |
20:31 | Bertl | okay, we switched to Vivado a few weeks ago, after Aaron (from Xilinx) suggested that we have a look
| |
20:31 | gcolburn | how's it working?
| |
20:32 | Bertl | from the VHDL PoV not much changed, XDCs have replaced UCFs
| |
20:32 | Bertl | and what IMHO is a big advantage, they completely switched to TCL
| |
20:32 | gcolburn | for scripting?
| |
20:32 | Bertl | for everything
| |
20:33 | Bertl | you basically start with a TCL shell where everything happens (or can happen :)
| |
20:33 | Bertl | including GUI stuff
| |
20:33 | gcolburn | okay. so it probably makes a custom build chain a bit easier?
| |
20:33 | Bertl | works pretty well so far, of course, it's the second (or maybe third) release of that product, so it's somewhat buggy
| |
20:34 | gcolburn | well if you guys are switching completely to that I'll make the jump
| |
20:34 | Bertl | but the interactive TCL shell helps a lot
| |
20:34 | gcolburn | are they still using SDK based on eclipse?
| |
20:35 | Bertl | the SDK didn't change AFAICT, but I haven't had much contact with that yet
| |
20:35 | Bertl | I've built my own kernel and userspace and there is no need for the SDK ATM
| |
20:35 | gcolburn | okay. that's nice
| |
20:35 | gcolburn | I haven't had time to get into building a custom kernel, so I've just done bare metal apps in SDK
| |
20:36 | Bertl | I'm kind of building kernels all the time, so that was easy for me
| |
20:37 | Bertl | btw, you can grab the kernel and userspace and if you like I can tell you how to roll your own as well
| |
20:37 | gcolburn | yeah, many years ago I built some custom kernels for some features in gentoo linux, but its been a while!
| |
20:37 | gcolburn | sure, that would be great
| |
20:37 | gcolburn | so you're mainly using scripts to interface with the sensor and not writing C correct?
| |
20:38 | Bertl | the kernel and initramfs are part of the HDMI test package
| |
20:38 | Bertl | I'm doing both, scripts where speed isn't that important (mostly bash atm) and C where I need higher performance
| |
20:38 | gcolburn | yeah okay
| |
20:39 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/ALPHA/axiom_alpha_hdmi_test_v0.1.zip
| |
20:40 | Bertl | http://vserver.13thfloor.at/Stuff/AXIOM/ALPHA/hdmi_test_v0.1/ (here you can see the vivado.tcl stuff)
| |
20:40 | Bertl | this is basically the TCL script building the PL bitstream
| |
20:40 | gcolburn | ok
| |
20:41 | gcolburn | have you had many testers for the HDMI?
| |
20:41 | Bertl | did you configure TFTP booting yet?
| |
20:41 | gcolburn | yes, I just don't have the latest boot image
| |
20:41 | Bertl | okay, good
| |
20:42 | Bertl | it simplifies testing stuff a lot IMHO
| |
20:42 | gcolburn | I've had a weird problem where, when I have TFTP/DHCP running from Ubuntu Linux, my wifi connection seems to freeze up. I had specified the MAC of the zed board specifically
| |
20:42 | Bertl | regarding HDMI testing, I only know of se6astian actually testing the code, but maybe a few others did so too :)
| |
20:43 | gcolburn | ok, well if I can find a cheap DVI->HDMI converter I can test it
| |
20:43 | Bertl | you might want to simply switch to separate network for the zedboard
| |
20:43 | Bertl | maybe even a separate cable to the zedboard
| |
20:44 | Bertl | (simple on a PC, harder on a laptop)
| |
20:44 | gcolburn | yeah
| |
20:44 | gcolburn | right now I'm using Ubuntu in VirtualBox on my Mac laptop
| |
20:44 | Bertl | but OTOH, there are USB to Ethernet dongles available nowadays
| |
20:44 | Bertl | do you have thunderbolt?
| |
20:45 | Bertl | (the interface)
| |
20:45 | gcolburn | no
| |
20:46 | gcolburn | it just has ethernet, USB, and mini display port (which I use an adapter to DVI)
| |
20:46 | Bertl | okay, so still USB 100MBit ethernet would be an option
| |
20:46 | Bertl | (and maybe more than sufficient for the ZedBoard boot)
| |
20:48 | gcolburn | its been a little while since I've played with the zed board as things have been a bit busy. I was out of town for work for a bit. I'm trying to remember if I was using ethernet on the zed board before
| |
20:49 | gcolburn | i don't think I was, because it would try to get an IP over dhcp from my router, which I couldn't configure for TFTP
| |
20:49 | gcolburn | so I setup Ubuntu to be the DHCP/TFTP server
| |
20:51 | gcolburn | this could work I would think: http://www.amazon.com/DVI-HDMI-Cable-6ft-Male-Male/dp/B0002CZHN6
| |
20:51 | gcolburn | left the channel | |
20:52 | gcolburn | joined the channel | |
20:55 | Bertl | yes, it should work
| |
20:56 | gcolburn | I'll go ahead and order one
| |
20:56 | Bertl | if you bought a higher end graphics card for a PC (or know somebody who did) then it might have an adapter plug as well
| |
20:56 | gcolburn | yeah
| |
20:56 | Bertl | but the cable is definitely a better choice, less noise
| |
20:57 | gcolburn | I've considered purchasing a new desktop to use as a server and have a nice graphics card on it (I've done some GPGPU computing for research when I was at the university)
| |
20:58 | gcolburn | I'll start with the cable though
| |
20:58 | gcolburn | so what are some of the main areas of development you need help with?
| |
20:59 | Bertl | ATM, unless you want to hack on FPGA code, the main area is probably testing stuff and writing small scripts to interpret settings and to configure the sensor or the HDMI part
| |
20:59 | Bertl | also documentation is always welcome (for all areas)
| |
20:59 | gcolburn | okay. how would I test the scripts without a sensor?
| |
21:07 | Bertl | we did a fake busybox devmem
| |
21:07 | Bertl | which basically uses a file to store the data normally written into the sensor registers
| |
21:07 | gcolburn | okay
| |
21:07 | Bertl | most of the registers are read/write and read back the written value anyways
| |
21:08 | Bertl | (notable exceptions are the temperature sensor for example)
| |
21:08 | Bertl | so this should already get you an environment which almost behaves like a real sensor (from the SPI PoV)
| |
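The fake devmem Bertl describes could be mocked up as below: a file-backed register store where writes persist and reads return the stored value. This is a minimal sketch, assuming a JSON file as the backing store; the register address used in the example is made up.

```python
import json
import os

class FakeSensorRegs:
    """Stand-in for the fake busybox devmem described above: writes land
    in a file and reads return the stored value, mimicking the mostly
    read/write CMV12000 register map (read-only registers like the
    temperature sensor would need special-casing)."""

    def __init__(self, path):
        self.path = path
        if os.path.exists(path):
            with open(path) as f:
                self.regs = json.load(f)
        else:
            self.regs = {}

    def write(self, addr, value):
        # Persist every write so separate script invocations see it,
        # just like a file-backed devmem would.
        self.regs[str(addr)] = value
        with open(self.path, "w") as f:
            json.dump(self.regs, f)

    def read(self, addr):
        return self.regs.get(str(addr), 0)

regs = FakeSensorRegs("/tmp/fake_regs.json")
regs.write(82, 1000)   # hypothetical register address and value
print(regs.read(82))   # -> 1000
```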
21:09 | gcolburn | I looked through the datasheet a while ago and saw how many of the register settings depend on what modes of operation one wants to use. do you guys have an idea of which modes of operation you'd like to demonstrate on the prototype?
| |
21:09 | Bertl | for the HDMI part, you have everything we have (regarding hardware) so there you are testing on the real thing :)
| |
21:09 | Bertl | we definitely want to play around and test different stuff
| |
21:10 | Bertl | after all, we need to get a feeling for the sensor itself
| |
21:10 | gcolburn | yeah
| |
21:10 | Bertl | we have 32 lanes, 16 on each side
| |
21:10 | gcolburn | looks like it has a lot of capabilities, but that requires a lot of register configurations
| |
21:10 | Bertl | so we can test 16 single side vs 8/8 double side readout for example
| |
21:11 | Bertl | we also want to target highest possible resolution and framerate with 32lanes
| |
21:11 | gcolburn | okay
| |
21:11 | Bertl | and we probably want to test the exposure curves
| |
21:12 | Bertl | I'm not sure that we want to use less than 300MHz, so everything related to lowering the LVDS clock rate is probably not that important
| |
21:12 | Bertl | we definitely want to test the subsampling and windowing
| |
21:13 | gcolburn | would it be useful to create some spreadsheets of register settings for different configurations (for documentation), and then create the corresponding scripts?
| |
21:14 | Bertl | I guess so, probably a smart script which can 'calculate' the settings for given input values/parameters would be a good solution as well
| |
21:14 | gcolburn | yeah
| |
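A parameter-to-register calculator of the kind Bertl suggests could look like this sketch. The line timing and register split here are purely illustrative assumptions; the real CMV12000 exposure formula comes from the datasheet.

```python
def exposure_registers(exp_time_s, line_time_s=1 / 38400, reg_width=16):
    """Convert a desired exposure time into a (high, low) register pair
    counting line times. Illustrative formula only, not the datasheet's:
    the assumed line time of 1/38400 s is a placeholder."""
    count = round(exp_time_s / line_time_s)
    low = count & ((1 << reg_width) - 1)   # low 16 bits
    high = count >> reg_width              # overflow into the high register
    return high, low

print(exposure_registers(0.01))   # 10 ms -> (0, 384)
```

The point is that a user asks for "10 ms" and the script derives the raw register values, instead of the user doing the arithmetic by hand.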
21:15 | Bertl | AFAIK, Thomas is working on some sensor related scripts, but I don't know what the current status is
| |
21:16 | Bertl | se6astian: do we have the set_exposure script somewhere?
| |
21:16 | gcolburn | I saw some of the emails from the other guy who helped with some of the exposure script. Some of the calculations are a bit hard to read in the script; for me it would be easier to test the calculations in a spreadsheet and then write the script. Other people could also check the equations more easily than having to read the Polish notation
| |
21:17 | Bertl | I agree that it might be confusing at the first glance, but actually I consider it a rather smart solution, mainly because with the limited environment, dc/bc is a great tool to have
| |
21:18 | Bertl | not that you really need it for those calculations
| |
21:18 | Bertl | what I think would be great to have is some kind of 'info' script
| |
21:19 | gcolburn | okay. what do you have in mind?
| |
21:19 | Bertl | and I already communicated that to Thomas (not sure how well I communicated it though)
| |
21:19 | Bertl | the basic concept is to have something which can tell you exactly what state the sensor is in
| |
21:20 | Bertl | i.e. how many bits are configured, what exposure time is set, if it is auto or external
| |
21:20 | Bertl | how many lanes are used, where they are located, what the image area is, etc
| |
21:20 | gcolburn | yeah
| |
21:21 | Bertl | because when something doesn't work as expected while testing with the prototype, I usually have to dig into those registers to find out why
| |
21:21 | gcolburn | so something like: info -exposure -lanes -imagearea
| |
21:21 | gcolburn | with a mode that just dumps everything?
| |
21:22 | gcolburn | so you could select limited information if you wanted, or all of it
| |
21:22 | Bertl | yes, I guess even cmv_info | grep -i exposure would suffice
| |
21:22 | Bertl | it's unix after all :)
| |
21:22 | gcolburn | yeah that could work
| |
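A grep-friendly cmv_info could be as simple as one decoded line per field. The register addresses and bit meanings below are placeholders for illustration, not the real CMV12000 map:

```python
# Placeholder register map; real addresses and decodings come from
# the CMV12000 datasheet.
REG_FIELDS = {
    "exposure": (71, lambda v: f"{v} line times"),
    "bit_mode": (118, lambda v: {0: "12 bit", 1: "10 bit", 2: "8 bit"}.get(v, "?")),
}

def cmv_info(read_reg):
    """Return one 'name: decoded value' line per field, so the output
    can be piped through grep -i as suggested above."""
    return [f"{name}: {decode(read_reg(addr))}"
            for name, (addr, decode) in REG_FIELDS.items()]

fake_regs = {71: 1000, 118: 0}     # stand-in for the actual SPI reads
print("\n".join(cmv_info(fake_regs.get)))
# exposure: 1000 line times
# bit_mode: 12 bit
```

Decoding at dump time (rather than printing raw register values) is what saves the trip back into the datasheet when debugging.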
21:25 | gcolburn | well if you'd like I can start working out some of the register settings/configurations for the different main modes and documenting them, and then I could move on to writing some scripts
| |
21:25 | Bertl | on a different page, we will be bursting sensor data into memory buffers (for snapshots and maybe even short recordings)
| |
21:26 | gcolburn | is that to get the 4k?
| |
21:26 | Bertl | the data will arrive in a twelve bit raw format with (yet unknown) alignment
| |
21:26 | Bertl | yes
| |
21:26 | Bertl | and we need some userspace tools to collect that data and write it in some format we can process/view
| |
21:27 | gcolburn | so I don't come from a cinematography background, more photography. is there a defined standard format for 4k?
| |
21:27 | gcolburn | i tried a web search and couldn't find much
| |
21:27 | Bertl | I'm sure there is, but that's not my area either :)
| |
21:27 | gcolburn | ok
| |
21:27 | gcolburn | well for initial still frame testing I can convert it into dng
| |
21:28 | gcolburn | my first code for that was in python (as it was quick to code), but I'm porting it to C
| |
21:28 | gcolburn | for speed
| |
21:28 | gcolburn | if that would be of any use
| |
21:28 | Bertl | sounds good to me, as long as it can handle the raw formats we throw at it
| |
21:28 | gcolburn | oh yeah
| |
21:29 | Bertl | I can already say that it will be similar to the sensor data transfer
| |
21:29 | gcolburn | yeah
| |
21:29 | Bertl | so we get a number of pixels (up to 128) at once
| |
21:29 | Bertl | and they are in the well known bayer pattern
| |
21:29 | gcolburn | yeah, I'm not too worried about that
| |
21:29 | Bertl | we probably will align every line to 4k
| |
23:30 | gcolburn | what I've done is taken raw images from my camera and I wrote a DNG reader to read the bayer array, and then be able to write it back out
| |
23:30 | Bertl | and we will burst/store the pixels (regardless of the number) one after the other
| |
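Unpacking such 12-bit bursts into usable pixel values might look like the sketch below. Since the alignment is "yet unknown", the little-endian two-pixels-in-three-bytes layout here is an assumption for illustration only:

```python
def unpack12(packed):
    """Unpack 12-bit samples packed two-per-three-bytes (little-endian
    nibble order) into a list of integers. The real burst alignment
    from the sensor frontend is still to be determined."""
    out = []
    for i in range(0, len(packed) - 2, 3):
        b0, b1, b2 = packed[i], packed[i + 1], packed[i + 2]
        out.append(b0 | ((b1 & 0x0F) << 8))   # first 12-bit pixel
        out.append((b1 >> 4) | (b2 << 4))     # second 12-bit pixel
    return out

print(unpack12(bytes([0xAB, 0xCD, 0xEF])))   # -> [3499, 3836]
```

A converter like the DNG writer discussed here would run this (or whatever packing turns out to be real) before laying the values out in the bayer grid.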
23:30 | Bertl | okay, so please walk me through the workflow
| |
21:31 | Bertl | I ssh to the ZedBoard and start your tool 'A' which connects to the memory area (or receives the data from stdin?)
| |
21:31 | Bertl | and outputs the lossless compressed DNG data to stdout?
| |
21:31 | gcolburn | that could be done
| |
21:32 | Bertl | on the linux box, I then can view that image with?
| |
21:32 | gcolburn | there are two options
| |
21:32 | gcolburn | either existing software that views raw files
| |
21:32 | Bertl | like?
| |
21:32 | gcolburn | photoshop, lightroom, etc. (there's a bunch of open source ones too)
| |
21:32 | gcolburn | or another idea
| |
21:33 | gcolburn | we can use a program called dcraw
| |
21:33 | gcolburn | to generate a demosaiced image
| |
21:33 | se6astian | darktable, raw therapee, gimp, dcraw, ufraw are all open source software solutions
| |
21:33 | gcolburn | yeah
| |
21:33 | gcolburn | http://www.cybercom.net/~dcoffin/dcraw/
| |
21:33 | se6astian | but I am off to bed :)
| |
21:33 | Bertl | have a good one!
| |
21:33 | se6astian | thanks
| |
21:33 | se6astian | good night!
| |
21:33 | gcolburn | see you later!
| |
21:34 | se6astian | left the channel | |
21:34 | gcolburn | dcraw is free to use
| |
21:34 | Bertl | well, yeah, I think I need something which can be used in a pipe style setup (and definitely on linux)
| |
21:34 | Bertl | i.e. something like display from ImageMagick
| |
21:34 | gcolburn | I think dcraw can output to stdout
| |
21:34 | Bertl | and I guess it should be reasonably fast and allow to view the raw data somehow
| |
21:35 | Bertl | i.e. so that I can zoom in somehow and see each pixel
| |
21:35 | gcolburn | yeah
| |
21:35 | gcolburn | would you be transferring a file off the zed board to view it
| |
21:35 | gcolburn | ?
| |
21:36 | Bertl | yes, definitely, via the beforementioned ssh pipe
| |
21:36 | Bertl | i.e. I would ssh into the zedboard executing the image capture tool which outputs the dng
| |
21:36 | gcolburn | ok
| |
21:36 | Bertl | the output on the linux box is then piped into the viewer/converter
| |
21:37 | gcolburn | okay, would this be just for still frames or for video?
| |
21:37 | Bertl | primarily for still images
| |
21:37 | Bertl | for video I guess we want/need to do some processing on the zedboard
| |
21:38 | gcolburn | ok, so it wouldn't be the end of the world if after you run the capture program it writes to a file, and then you ssh the file back to your computer to view?
| |
21:38 | Bertl | and I'm not sure we want to spend too much time on that at the alpha stage
| |
21:39 | Bertl | not the end of the world, but definitely annoying ... although I could simply store the file in memory (if it is small enough) and 'cat' it to get the pipe data I'd prefer
| |
21:39 | gcolburn | ok
| |
21:39 | Bertl | but of course, simply dumping the data to stdout would be a lot simpler
| |
21:39 | gcolburn | yeah
| |
21:40 | Bertl | does the dng writer need to seek?
| |
21:40 | gcolburn | if it's not built into dcraw we can modify the code to do that
| |
21:40 | gcolburn | not necessarily
| |
21:40 | Bertl | the dcraw is on the PC side, I do not care about memory or filesystem resources there
| |
21:40 | Bertl | so writing to a file and opening that file is perfectly fine
| |
21:41 | Bertl | (i.e. I was talking about the tool running on the zedboard)
| |
21:41 | gcolburn | okay, I was thinking dcraw would be compiled on the zedboard
| |
21:41 | gcolburn | either way works
| |
21:41 | Bertl | I don't see why we should bother with dcraw on the zedboard
| |
21:41 | gcolburn | ok
| |
21:41 | Bertl | (except for the fun of doing it :)
| |
21:41 | gcolburn | yeah :)
| |
21:42 | Bertl | so I guess you have at least two areas now where you can go crazy the rest of the weekend, yes?
| |
21:42 | gcolburn | regarding the seek
| |
21:42 | gcolburn | to read a dng you definitely have to do a lot of seeks
| |
21:42 | gcolburn | i'll have to see if I'll need to do any for the writing
| |
21:43 | gcolburn | I basically have to create an index with the offsets to different TIFF tags and where the image data is located
| |
21:45 | gcolburn | a DNG file is basically an extension of a TIFF
| |
21:45 | gcolburn | to add support for raw data
| |
21:45 | gcolburn | and any data can be stored anywhere in the file, so you have to start with the index tree that tells you the offsets to any data you want; then to write it you just have to come up with your own file layout and write the offsets to the main index
| |
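The offset scheme gcolburn describes can be demonstrated with a minimal parser that follows the TIFF header to the first IFD (the "main index") and reads its tag IDs; DNG reuses this exact structure:

```python
import struct

def read_ifd0_tags(data):
    """Follow the TIFF header offset to IFD0 and return its tag IDs.
    DNG is a TIFF extension, so the same walk applies to DNG files."""
    endian = "<" if data[:2] == b"II" else ">"     # II = little-endian, MM = big
    magic, ifd0_off = struct.unpack(endian + "HI", data[2:8])
    assert magic == 42, "not a TIFF/DNG file"
    (count,) = struct.unpack(endian + "H", data[ifd0_off:ifd0_off + 2])
    tags = []
    for i in range(count):
        # Each IFD entry is 12 bytes: tag, type, count, value/offset.
        entry = data[ifd0_off + 2 + 12 * i : ifd0_off + 14 + 12 * i]
        tag, typ, n, value = struct.unpack(endian + "HHII", entry)
        tags.append(tag)
    return tags

# Minimal little-endian TIFF: header, then one IFD entry
# (tag 256 = ImageWidth, type 3 = SHORT, count 1, value 4096).
blob = (b"II" + struct.pack("<HI", 42, 8)
        + struct.pack("<H", 1)
        + struct.pack("<HHII", 256, 3, 1, 4096)
        + struct.pack("<I", 0))   # next-IFD offset 0: no more IFDs
print(read_ifd0_tags(blob))       # -> [256]
```

Writing goes the other way: lay out the image data and tag values however you like, then record their offsets in the IFD entries, which is why a writer may not strictly need to seek if the layout is computed up front.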
21:47 | gcolburn | the capture utility can write to stdout no problem though, since we (or I) would write that.
| |
21:51 | gcolburn | I just checked, dcraw can write to standard output out of the box as well
| |
21:51 | gcolburn | so from there you could view it with ImageMagick
| |
21:53 | Bertl | excellent! sounds like a plan then
| |
21:56 | gcolburn | okay. I'll start working on some code from that
| |
21:56 | gcolburn | what resolution do you plan on capturing at before down sampling to 4k?
| |
21:58 | gcolburn | I guess probably just leave it at full resolution? it seems like 4k isn't completely a defined size
|