
Modeling Color Spaces With Blender

samzenpus posted about a year ago | from the how-its-done dept.

Open Source 35

Mrs. Grundy writes "When creative professionals want to visualize colors in three dimensions, they often use dedicated and sometimes expensive software. Photographer Mark Meyer shows how, with the help of its Python scripting interface, you can create graphics of color models in Blender. He demonstrates plotting in XYZ, LAB, and xyY space, and also includes the Blender file to show how it's done."
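For readers without Blender handy, the color-math half of the technique can be sketched in plain Python: convert sRGB samples to CIE XYZ coordinates, then (inside Blender) place an object at each coordinate. This is a generic sketch of the idea, not the article's actual script; the bpy calls appear only as comments because bpy is unavailable outside Blender.

```python
# Convert sRGB values to CIE XYZ coordinates, which can then be plotted as
# points in Blender via its bpy API (e.g. a small sphere per coordinate).
# The matrix below is the standard sRGB (D65) to XYZ transform.

def srgb_to_linear(c):
    """Undo the sRGB gamma curve (IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz(r, g, b):
    """Map sRGB components in [0, 1] to CIE XYZ (D65 white point)."""
    rl, gl, bl = (srgb_to_linear(v) for v in (r, g, b))
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    return (x, y, z)

# Inside Blender, one might place a sphere at each converted point, roughly:
#   import bpy
#   bpy.ops.mesh.primitive_uv_sphere_add(radius=0.01,
#                                        location=srgb_to_xyz(r, g, b))
```

Sweeping r, g, b over a grid and plotting the resulting XYZ points yields the familiar warped solid of the sRGB gamut.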


35 comments


BEWARE THE VEHICLE !! (-1)

Anonymous Coward | about a year ago | (#43179353)

Hey well I'm the friendly stranger
In the black sedan
Oh won't you hop inside my car?
I got pictures, got candy, I am a lovable man
I'd like to take you to the nearest star
I'm your vehicle baby
I'll take you anywhere you want to go
I'm your vehicle woman
By now I'm sure you know
That I love you (love you)
Need you (need you)
I want to, got to have you child
Great God in heaven you know I looooovvvve you
Well if you want to be a movie star
I got the ticket to Hollywood
Well if you want to stay just like you are
You know I think you really should
I'm your vehicle baby
I'll take you anywhere you want to go
I'm your vehicle woman

By now I'm sure you know
That I love you (love you)
I need you (need you)
I want to - got to have you child
Great God in heaven you know I loooovvvve you
And I'm your vehicle babe

Pretty but why? (3, Interesting)

pipatron (966506) | about a year ago | (#43179439)

Why massage and hack a program like Blender when you can use the venerable POV-Ray [povray.org], an open-source raytracer for 25 years now, the first raytracer in space, etc.?

You can already do all of this directly in its scene description language, and you will get exact results instead of interpolated meshes.

POV-Ray (1)

Taco Cowboy (5327) | about a year ago | (#43179477)

I use POV-Ray too, but have to admit that the POV-Ray community is dwindling, with more and more newcomers opting for Blender instead.

Re:Pretty but why? (1)

Anonymous Coward | about a year ago | (#43179547)

There doesn't seem to be much massaging or hacking at all. Because of the substantial Python API, it only takes a few lines of code to get Blender to do this. The hardest part, I would guess, is that some of the code he's using is not particularly well-documented.

Re:Pretty but why? (0)

Anonymous Coward | about a year ago | (#43179723)

POV-Ray is extremely stale today. There isn't even an OS X version; they still ship a PPC OS 9 binary that will no longer even work via emulation. The Windows and Linux code is just as old.

Re:Pretty but why? (2)

wiredlogic (135348) | about a year ago | (#43180017)

Considering that it's a command-line tool with no system dependencies, building your own binary from source isn't a significant burden.

Re:Pretty but why? (1)

paulkoan (769542) | about a year ago | (#43179905)

Wow, I had lots of fun with pov back in the day, and Vivid before that.

Writing stuff directly in their respective scene languages was a breeze too, and so easy to output from another language that we used C to produce scenes and then left POV to chug through them for days to produce animations.
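That generate-scenes-from-another-language workflow can be sketched in Python just as easily as in C. The SDL emitted below is a minimal hand-written scene (camera, light, one orbiting sphere per frame); treat the exact syntax as an approximation to be checked against the POV-Ray documentation, and the file names as arbitrary.

```python
import math

def make_scene(frame, total_frames):
    """Emit a minimal POV-Ray SDL scene with a sphere orbiting the origin."""
    angle = 2 * math.pi * frame / total_frames
    x, z = 2 * math.cos(angle), 2 * math.sin(angle)
    return "\n".join([
        "camera { location <0, 2, -6> look_at <0, 0, 0> }",
        "light_source { <5, 10, -10> color rgb <1, 1, 1> }",
        "sphere {",
        f"  <{x:.4f}, 0, {z:.4f}>, 0.5",
        "  pigment { color rgb <1, 0, 0> }",
        "}",
    ])

# Write one .pov file per frame, then batch-render each one with POV-Ray
# and stitch the frames into an animation afterwards.
for frame in range(4):
    with open(f"frame_{frame:03d}.pov", "w") as f:
        f.write(make_scene(frame, 4))
```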

Perhaps if Blender could import POV's SDL, and given it can use POV as a renderer, it would make sense to stick with Blender so you only need one main tool.

Re:Pretty but why? (2)

Sussurros (2457406) | about a year ago | (#43179919)

I didn't know POV-Ray was still around. It caused me untold grief in the 90s and it's the main reason why I ended up buying Shade and Poser, which were much easier to use. I've also used Photoshop and now use Blender. The very thought of using POV-Ray again gives me a cold shiver; it would need to be a whole new beast to be a viable alternative.

Re:Pretty but why? (0)

Anonymous Coward | about a year ago | (#43179991)

Well, you obviously are a pixel wanker and not a hacker. POV is incredible.

Re:Pretty but why? (1)

Anonymous Coward | about a year ago | (#43180013)

Why use POV-Ray? You could just code it all in C, and it would be faster than POV-Ray, and easier to code and understand too, since 43 years back. Oh, and there is a continuous and thriving C coding community. And there are updates to C on at least a quarter year basis (every 3 months a new and updated version!). Coding in C is 100,000,000 times easier than trying to decode the cryptic byzantine mess that is POV-ray.

Re:Pretty but why? (0)

Anonymous Coward | about a year ago | (#43180195)

Why massage and hack a program like blender when you can use the venerable POV-Ray [povray.org] , open source raytracer since 25 years back, first raytracer in space, etc.

You can already do all of this directly in its scene description language, and you will get exact results instead of interpolated meshes.

My target was SVG so I used Inkscape and gradients.

Re:Pretty but why? (0)

Anonymous Coward | about a year ago | (#43186331)

Because 3D is more informative when you can view it in realtime. Povray was good until cheap 3D hardware made realtime 3D rendering possible for home users.

got a favor to ask you... (-1)

Anonymous Coward | about a year ago | (#43179457)

So, I'm getting married to the love of my life in a couple months. But here's the thing: she won't do anal. Won't let me rim her, won't let me stick a finger up her touch hole, won't stick her finger in my shit chute, won't wear a strap-on and fuck me in the ass. I just want to try it once, cross it off my list and be done. So any girls (or guys?) out there that want to make my dreams come true? I'm 5' 4", greek, and my penis is almost 5". Disease free.

Let me know.

Got it, meet me here (-1)

Anonymous Coward | about a year ago | (#43179593)

Meet me in the Caltrans rest area outside of San Leandro, CA. Here is a picture [wikipedia.org] of the toilet stall. Knock four times then I'll show you the rest.

Can I use Perl instead ? (0)

Anonymous Coward | about a year ago | (#43179459)

Not that good with Python

Anyone ever tried the same trick with Perl ?

Re:Can I use Perl instead ? (1)

Anonymous Coward | about a year ago | (#43179829)

You could do it in Perl, but since Blender doesn't have a Perl API, you'd have to write your own modeling, lighting, rendering, and materials routines. Might be easier to join the rest of the world and just learn a little Python.

Re:Can I use Perl instead ? (0)

Anonymous Coward | about a year ago | (#43185135)

Get laid, that's an order.

I wanted this to be cooler. (4, Funny)

unkiereamus (1061340) | about a year ago | (#43179889)

I dunno about the rest of you, but I read the title as 'Modeling Color Spaces With a Blender.' That was gonna be awesome.

I was hoping I'd get to see a real-life version of the Mac working cursor... now that I think on it, I believe when I get off work I'll be going to the thrift store to buy a cheap blender.

I'll post the video.

Deuteranomaly power :-) (5, Informative)

Anonymous Coward | about a year ago | (#43180001)

Cool... now I can finally make a proper color diagram for my fellow deuteranomalous trichromats, scaled to maximize (and give proper names) to OUR gamut... including "dellow" ('deuteranomalous yellow' == our equivalent to "Unique Yellow" -- the color we see as having no hint of green or red. Some true deuteranopes have proposed calling it 'deen' -- deuteranopic green).

Since 94% of you are probably thinking, "wtf, 'dellow'?!?" right now, deuteranomaly is generally thought to occur when somebody ends up with 'green' cones whose sensitivity peak is closer to red than the statistical norm (with some semi-recent refinements theorizing that SOME 'protanomalous trichromats' might REALLY be atypical outright dichromatic deuteranopes with a mutation that gives us foveal rod cells to compensate and act like a third cone type under the right lighting conditions).

Anyway, for us, the part of the spectrum you call 'yellow' falls into a vast, bland ocean that's just plain 'featureless green' to us, and the color you call 'Home Depot Orange' is blatantly red, but we have a tiny zone sandwiched between them where moving a tiny bit left or right makes a HUGE difference to the color. Colors WE might call 'dellogreen' (greenish dellow) and 'dred' (reddish dellow), together with dellow itself between them, would all look like kind of the same orange to you, but we could pick them out and name them as easily as you can differentiate between yellow, orange, and red.

Our colorspace and gamut are absolutely compressed and missing a few bits of depth, but it's made worse by the fact that digital cameras, monitors, and everything else samples or reproduces 'green' at the wrong frequency for us. The problem isn't that I need 'more green' to accurately capture and reproduce 'yellow'; the problem is that mainstream hardware samples the wrong green, then squanders most of its bits on areas of the spectrum that are useless to us, and totally starves the tiny sliver where they'd do us the most good. We can talk about dellogreen, dellow, and dred, but trying to photograph/video them, then look at them on a typical RGB monitor (vs what you'd probably call a red-yellow-blue monitor, but we'd see as unambiguously red, green, and blue) would make them all look like 'dellow' to us, the way they'd all look orange to you.

I look forward to future pentachromatic imaging sensors with red, dellow, green, lumirod (the sensitivity curve of rod cells), and blue sensors, and tvs that natively do red-dellow-green-blue. Only tetrachromatic women would get the full benefit, but apparently the color I'm calling 'dellow' (deuteranopic yellow) is pretty close to the peak of a tetrachromatic woman's fourth cone, so we'd get a free ride out of the deal and finally get to have tvs that reproduce OUR gamut in its full possible glory.

Re:Deuteranomaly power :-) (0)

Anonymous Coward | about a year ago | (#43180631)

I think you just made the Internet shut up and pay attention.

Re:Deuteranomaly power :-) (2, Interesting)

Anonymous Coward | about a year ago | (#43181917)

Awesome. The amount of misinformation and misunderstanding about anomalous color vision -- even among people who should KNOW better by now -- is staggering. The truth is, the world isn't as neatly RGB as most people believe, not even for chromatypicals (people whose color vision more or less lines up with statistical norms). Read about the history of things like CIE color, and it quickly becomes obvious that they didn't so much come up with a precise definition of colorspace as define the dogma of an RGB religion that most people automatically buy into and assume is a 100% accurate description of how the world works.

What? You think RGB is fine? OK, write a program that displays a black window, and draws 24-bit gradients that run in 512 steps from 000 to f00 to fff for red, and do the same for green (000 to 0f0 to fff), blue (000 to 00f to fff), yellow (000 to ff0 to fff), cyan (000 to 0ff to fff), and magenta (000 to f0f to fff), and tell me they all look smooth and balanced, instead of looking like somebody's drunken crack fantasy. I'll admit that it's worse for me because I'm deuteranomalous, but even for people with NORMAL color vision, it doesn't look the way you'd expect it to look. That's what's wrong with RGB... it's a piss poor compromise for most, and just plain wrong for about 10%.
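The poster's 512-step gradient is easy to generate. Below is a minimal sketch; the step layout (256 steps from black to the pure primary, then 256 up to white) is one reasonable reading of the description, not a spec:

```python
# One 512-step gradient as described above: black -> pure primary -> white.
# Ramping a channel this way makes RGB's perceptual non-uniformity visible:
# the two halves appear to change brightness and hue at very different rates.

def gradient_step(step, channel):
    """Return the (r, g, b) tuple for one of the 512 steps, 0-indexed.

    channel is 0, 1, or 2 (red, green, or blue); setting two channels at
    once gives the yellow/cyan/magenta variants the poster describes.
    """
    rgb = [0, 0, 0]
    if step < 256:          # steps 0-255: black -> pure primary
        rgb[channel] = step
    else:                   # steps 256-511: pure primary -> white
        rgb = [step - 256] * 3
        rgb[channel] = 255
    return tuple(rgb)

# e.g. the red ramp visits (0,0,0) ... (255,0,0) ... (255,255,255)
```

Drawing each step as a 1-pixel-wide vertical stripe against black reproduces the test the poster proposes.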

Take the whole reconciliation of "gamut" with "RGB". There's a HUGE chunk of the spectrum that can't be precisely reproduced with mainstream RGB color, and it's not just because of the difference between pure white and pure black. We capture images with sensors that have a seriously warped view of the universe, then display them on output devices with an even MORE limited ability to accurately reproduce the world. Many of those limits exist because nobody ever questions WHY they're pervasive, and whether we could do better.

Let's take the example of someone with normal trichromatic vision. If you took a camera whose Bayer array was laid out to sample light at not just the traditional peak intensities and sensitivity curves of chromatypical red, green, and blue, but ALSO at the deuteranomalous green peak, the protanomalous red peak, and the "lumirod" peak, and made monitors with an additional subpixel color that lined up with deuteranomalous yellow, the resulting image would look enormously better to anomalous deuteranopes and tetrachromatic women, but would ALSO look better to everyone else as well. There would be no need for anomalous protanopic (red-weak) individuals to try turning up the red in a futile attempt to make the monitor's colors look more accurate (something that's as futile as trying to get accurate true-color images out of the Mars Rover Curiosity, for exactly the same reason), because a signal calibrated to the metamers of a tetrachromatic woman would automatically be tuned to the proper me tamers for deuteranomalous & protanomalous individuals as well, and would look either no worse (or slightly better) for everyone else. Well, ok, it wouldn't quite be perfect for protons, but it would still be a hell of a gigantic improvement over the RGB status quo.

Plus, ROYGCB cameras and RYGB displays would give the consumer electronics industry a whole new reason to get everyone to dump everything they own and replace it all, with the perk that since some women would be the major beneficiaries (either because they're tetrachromatic, or because they'd no longer be watching TVs that a deuteranomalous or protanomalous man had to try and tweak to HIS color preferences), it would all have built-in Wife Acceptance Factor.

Re:Deuteranomaly power :-) (0)

Anonymous Coward | about a year ago | (#43181973)

Argh. I'm going to shoot the person who made the goddamn Mac I'm borrowing insidiously autocorrect words like p-r-o-t-a-n into 'proton', and m-e-t-a-m-e-r-s into 'me tamers'... with no obvious way of turning it off. The same thing happened with my Android tablet last night. Can someone PLEASE give us fscking software that DOESN'T insidiously autocorrect everything to the norms of a 10 year old tweeting about the mall? God, I don't know what pisses me off more... bad color reproduction, or autocorrection...

Re:Deuteranomaly power :-) (1)

neminem (561346) | about a year ago | (#43182737)

I don't think anyone would argue that RGB is "how the world works". I think RGB was just an easy model to visualize and to code for, that was good enough for most purposes. (As a colorblind person, I don't like it so much, but as a programmer, I like it tons. :p)

Re:Deuteranomaly power :-) (2, Funny)

Anonymous Coward | about a year ago | (#43180973)

Sounds like too much work. Instead we'll poke out your eyeballs.

Re:Deuteranomaly power :-) (0)

Anonymous Coward | about a year ago | (#43181747)

That's the damndest thing I've read in a while.

Re:Deuteranomaly power :-) (0)

Anonymous Coward | about a year ago | (#43181873)

Dude, my breakfast is cold now. You owe me some french toast.

But really, that is interesting and I'm going to be googling up on some of this later.
I've read up on a bunch of oddities with color interpretation before, virtual colors and other weird things, but I don't seem to have heard anything about this.

Re:Deuteranomaly power :-) (3, Interesting)

Anonymous Coward | about a year ago | (#43184079)

One of the more fascinating ones to read is this one: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2596756/ [nih.gov] ("Protanomaly without darkened red is deuteranopia with rods").

One thing that makes it so hard to talk about deuteranomalous gamut is the fact that the best and richest part of it mostly lies OUTSIDE the gamut of traditional RGB. Imagine if I gave you a yellow laser, then asked you to mix it with light from a monochromatic orange laser until it matched a red laser. It's impossible -- no matter how dim you make the yellow and how bright you make the orange, it will never look "red". That's the problem deuteranomalous individuals have with everything from photos to video... they always look "wrong" compared to real life (the same way as the Mars Rover's pics), and there's no adjustment we can make to compensate for it... even if we manage to flawlessly match one specific shade of orange by adjusting the relative brightness of red and green, the color of everything ELSE is going to be screwed up even worse. Our color definitions don't match everyone else's, and most of OUR colors get rolled off, attenuated, and mangled away when reproduced in CMYK or RGB.
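The laser-matching argument can be illustrated numerically: additive mixtures of two lights have chromaticities on the straight line segment between the two sources, so any target outside that segment is unreachable. The CIE xy coordinates below are rough approximations of spectral-locus values chosen for illustration, not measured data.

```python
# Approximate CIE xy chromaticities for three monochromatic sources.
YELLOW = (0.47, 0.53)   # ~577 nm, approximate
ORANGE = (0.60, 0.40)   # ~600 nm, approximate
RED    = (0.73, 0.27)   # ~650 nm, approximate

def mix(a, b, w):
    """Chromaticity of an additive mix: w parts light a, (1 - w) parts b."""
    return tuple(w * ai + (1 - w) * bi for ai, bi in zip(a, b))

# Sweep every mixing ratio: the mix's x coordinate never exceeds orange's,
# so red's chromaticity is unreachable no matter how the brightnesses vary.
max_x = max(mix(YELLOW, ORANGE, w / 100)[0] for w in range(101))
```

Since `max_x` stays at orange's 0.60 while red sits at 0.73, no yellow/orange ratio ever matches the red laser, which is the poster's point.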

The best way to illustrate the true spectrum in a way that would preserve everybody's detail and help people with "normal" vision understand deuteranomalous gamut would be to build a row of monochromatic lasers whose frequencies ranged from "pure green" (~500nm, I believe) at the left to extremely intense near-infrared (around 850nm) at the right... with extra lasers at smaller intervals around "orange", since that's where OUR bands of different colors are all clumped and smashed together.

Re:Deuteranomaly power :-) (3, Informative)

Anonymous Coward | about a year ago | (#43187369)

Another interesting journal article about rod behavior in dichromatic individuals: http://macboy.uchicago.edu/~eye1/pdf%20files/large%20field%20tri%20josa%201977.pdf [uchicago.edu]

It appears that under specific lighting conditions, most deuteranopes CAN reliably distinguish between deep saturated red and pure green, even after controlling for relative brightness. The catch is, the color has to fill their field of vision, and they have to study & contemplate it in relatively dim light. In other words, they can't look at a blinking RGB LED on a dashboard and reliably figure out whether it's blinking red or green, but if they spent 15 minutes sitting in a dim room with black walls, then walked into an equally dim bathroom with vivid red or vivid green walls & floor (with brightness adjusted to a level that should, in theory, make the walls look "dark brown" regardless of whether they're red or green), the likelihood that they'd correctly identify them as red or green is greater than random chance would suggest. The explanation is that large areas of color would involve rods, but small areas would use foveal vision (which normally lacks them). I'm pretty sure this article influenced the other one I cited.

So, despite evidence like this, why does just about everyone take for granted that deuteranopes and protanopes have literally ZERO ability to discern red and green, ever, period, end of story? Part of the limiting factor is cultural and cognitive... most people don't question the universe around them or feel compelled to constantly analyze and probe its boundaries. If an authority figure (like a doctor) tells them as a child that they can't tell the difference between red and green, and that everything they see is supposed to be blue, white, or yellow... most of them will take it at face value. Especially when they themselves notice that "red" sometimes looks distinctly different from green in their peripheral vision, tell someone like a doctor, and get smacked down & told they're just imagining things because it's impossible.

It's the colorblind engineers who start pulling out the lasers and studying them to see for themselves where the exact boundaries lie, and spend hours in dark rooms comparing the dim glow from a half-dozen $25 supersized infrared security camera floodlights from China to the dim green light from an Indiglo night light (a somewhat new phenomenon... up until a couple of years ago, there was no such thing as a source of 850nm infrared light that was cold enough to not act like an incandescent light, and bright enough to tickle just about everyone's cones enough to notice... if they were visible light, they'd be *blindingly* bright).

There's also another interesting twist to all of this... I haven't seen it officially documented, but if a rare deuteranope with foveal rods could end up passing for a somewhat odd anomalous trichromat, it seems like there's ALSO the possibility that a slightly less rare chromatypical trichromat who ends up with foveal rods might (emphasis on "might") end up with some degree of tetrachromatic color vision, too... and that most chromatypical trichromats might actually be able to learn to distinguish between "identical" metamers that are subtly different using their peripheral vision in dim lighting. It would be an example of some ability that's always been there, but nobody has ever really thought about, studied, or noticed because it would be so minor relative to their normal color vision that they'd only even become aware of it as a possibility after lots of personal experimentation and self-analysis. Of course, this would mean a tetrachromatic woman with foveal rods might be able to experience pentachromatic vision under the same circumstances.

Anyway, fascinating stuff. :-)

Re:Deuteranomaly power :-) (0)

Anonymous Coward | about a year ago | (#43200935)

OK, for anybody still following this... the grand prize: http://vision.psychol.cam.ac.uk/jdmollon/papers/Bostenetal05.pdf [cam.ac.uk]

Quick summary: Deuteranomalous individuals have a gamut that's restricted compared to chromatypicals, but if you just accept it on its own terms & redefine the green used by RGB metameric color so it conforms to deutan "unique green" (shifted towards red) instead, an entire band of unique hues emerges that all basically look "orange" to a non-deuteranomalous viewer, but form readily discernible bands that can be given names and recognized easily as distinct hues by deuteranomalous viewers.

Sort of a flawed premise in the summary... (4, Interesting)

forkazoo (138186) | about a year ago | (#43180333)

Just for the record, no creative professionals use dedicated and expensive tools to visualize color spaces. If they use an expensive tool like Maya for it, it's because they happen to have it handy for more sensible purposes. Visualizing color spaces is really just a novelty for most people. Anybody who needs to do it regularly isn't so much a "creative" professional, as a color scientist.

Still, sort of a neat demo of the Blender Python API.

Re:Sort of a flawed premise in the summary... (0)

Anonymous Coward | about a year ago | (#43181409)

ColorThink Pro is in common use among photographers, especially those who do their own printing.

Re:Sort of a flawed premise in the summary... (1)

robthebloke (1308483) | about a year ago | (#43181989)

Just for the record, plenty of people do use expensive tools to visualise colour spaces. I work with a small army of visual FX compositors and post production artists who use various tools to do exactly that. Some are off the shelf (e.g. Shake, Nuke), and quite a few are developed internally.

Re:Sort of a flawed premise in the summary... (2)

luckymutt (996573) | about a year ago | (#43183603)

I wouldn't call Shake or Nuke "dedicated" to visualizing color space. They do a tad bit more than that.

Re:Sort of a flawed premise in the summary... (0)

Anonymous Coward | about a year ago | (#43183535)

Plenty of professionals in the printing industry use visualized color spaces to compare input and output ICC color profiles. There is a great tool to do this already - ColorThink Pro from www.chromix.com . It is not free, but at $399 it is a must have for print professionals.

But will the color models... (1)

Anonymous Coward | about a year ago | (#43181665)

blend?
