
Background Luminance and Subtense Affects Color Appearance

Pei-Li Sun,1* Hung-Chung Li,2 M. Ronnier Luo1,3

1 Graduate Institute of Color and Illumination Technology, National Taiwan University of Science and Technology, Taipei, Taiwan
2 Graduate Institute of Applied Science and Technology, National Taiwan University of Science and Technology, Taipei, Taiwan
3 School of Design, University of Leeds, Leeds, North Yorkshire LS2 9JT, United Kingdom

Received 9 June 2016; revised 5 November 2016; accepted 6 November 2016
Abstract: Two psychophysical experiments were conducted to investigate color appearance under non-uniform surround conditions with variation of stimulus luminance, surround luminance, background luminance, background orientation, and background size. The results show that background size and surround luminance strongly influence the appearance, but that the orientation of the background pattern has little effect. A method to determine optimal parameters for the CIECAM02 color appearance model in lighting applications is proposed. A UGR-based model is also optimized for brightness estimation. The luminance of the adapting field can be estimated by Gaussian-like functions. © 2016 Wiley Periodicals, Inc. Col Res Appl, 42, 440–449, 2017; Published Online 27 November 2016 in Wiley Online Library (wileyonlinelibrary.com). DOI 10.1002/col.22108
Key words: color appearance model; lighting appearance; CIECAM02; unrelated color; brightness estimation
INTRODUCTION
In the past few years, color appearance has been intensively studied for a better understanding of human visual perception under a range of viewing conditions [1]. The CIE-recommended color appearance model, CIECAM02, has been successfully used in cross-media color reproduction [2]. However, it has not yet been applied to the rapidly growing illumination-related industries. One reason is that the CIECAM02 model needs the tristimulus values of the reference white and the background luminance ratio in order to calculate appearance coordinates. This information can be easily measured from displays and hardcopies, but it is not practicable to determine these essential parameters in lighting applications, since the luminance distributions of direct illumination and its indirect reflections are normally non-uniform [3]. In addition, only simple surround conditions such as dark, dim, and bright were tested in previous color appearance studies. In lighting applications, the surround condition is normally complex, and might mix very bright light sources and very dark shadows. Especially for nighttime observations, light sources are visually unrelated to their background [4]. The question of how to account for the impact of a non-uniform surround needs further study.
Brightness perception is greatly affected by surround conditions. A 300 lux spotlight in a dark viewing condition appears brighter than a 7000 lux cloudy sky. A department store uses many light fixtures to enhance the store's atmosphere; if the spatial arrangement of the light fixtures is not appropriate, some of them might cause unpleasant glare. The perceived brightness cannot be directly estimated from the vertical or horizontal illuminance of the light sources. Scene luminance can be measured by a 2D colorimeter or a calibrated HDR camera [5,6]. However, the XYZ tristimulus values measured by these devices are not directly linked to visual perception. The mechanism of light adaptation must be taken into account [7]. If the luminance contrast between different parts of the scene is great, the viewer will experience significant after-image effects. All these factors make brightness estimation a challenging task.
Luminance contrast is an important parameter in predicting visual brightness. There are many classic functions that are widely used for contrast estimation, such as Weber contrast [8], Michelson contrast [8], Whittle contrast [9], the Barten contrast sensitivity function [10], and Peli's image contrast [11]. In the field of glare measurement, background

*Correspondence to: P.-L. Sun (e-mail: plsun@mail.ntust.edu.tw)
© 2016 Wiley Periodicals, Inc.
440 COLOR research and application

luminance is the most important parameter in the CIE Unified Glare Rating [12]. However, how to determine the background luminance value of a complex scene is still an unanswered question. The visual impact might be related to retinal resolution. Zhang and Wandell [13] proposed the S-CIELAB metric, which calculates visual image differences by simulating the visual blur of different color channels. Kuang et al. [14] applied a low-pass Gaussian filter to the scene luminance image in their iCAM06 model to estimate local light adaptation of the scene. The size of the Gaussian filter was partly adapted from Yamaguchi's results [15], which suggest that the variance of the Gaussian filter should align with a 5° angular subtense of the scene. Nakamura [16] used a multiresolution luminance image to estimate the brightness of a scene. This method begins with a 2D scene luminance measurement, followed by an 11-level 2D wavelet decomposition. Different weights are then applied to the wavelet coefficients of different spatial frequencies; the weights are band-pass functions of spatial frequency. The scene brightness is predicted by inversion of the weighted wavelet coefficients. However, this approach does not take into account some important visual phenomena such as the Stevens effect [1], the Hunt effect [17], and the Helmholtz–Kohlrausch (H–K) effect [18].
CIE Technical Committee 8-01 proposed the color appearance model CIECAM02 in 2002. It takes luminance adaptation, chromatic adaptation, cone response functions, simultaneous contrast, the surround effect, discounting-the-illuminant, and the Stevens and Hunt effects into account [1], and can predict both relative and absolute color appearance attributes. The relative attributes, such as lightness, chroma, and hue, are commonly used for color management of imaging devices. On the other hand, the absolute attributes, such as brightness, colorfulness, and hue, can be used for estimating the color gamut of self-luminous imaging devices [19]. However, the methods to determine the input parameters of the model have not yet been well studied [20].
Another reason CIECAM02 [2] has not yet been applied to the rapidly growing illumination-related industries is that many lighting applications, such as estimating the visibility of traffic signs at night, fall within the mesopic range, which is not modeled by CIECAM02. As LED lighting applications become widespread, more research is focused on mesopic color appearance modeling [21]. In terms of research method, Shin et al. used a haploscopic color-matching technique [22,23], Eloholma applied a visual task performance method [24], and Bodrogi [25] used a direct magnitude estimation method with a binocular viewing technique. Fu et al. [26] investigated mesopic visual effects by changing luminance level and stimulus size. The results show that the size of the central stimulus affects brightness and colorfulness, which decrease in proportion to the luminance level. Two color appearance models, CAM97u and CIECAM02, were used to predict the color attributes in the study. The visual results showed that CAM97u was better in brightness prediction, whereas CIECAM02 performed better in colorfulness prediction. A model based on CIECAM02, named CAM02u, was also proposed specifically for predicting unrelated color appearance under photopic and mesopic lighting conditions. Koo and Kwak [27] further compared CAM97u and CAM02u psychophysically using unrelated colors. The results suggested that both models perform reasonably well, with CAM02u achieving the better results.
Withouck and Smet [28] investigated the performance of six existing models for predicting the brightness of unrelated stimuli. Their results indicated that none of the models performed acceptably, with an underestimation of the H–K effect. To take the H–K effect into account, the CAM97u model was modified by increasing its colorfulness weighting factor, yielding the model CAM97u,m, which outperformed the other models in their visual experiment. A comprehensive model for unrelated self-luminous stimuli, CAM15u [29], was derived recently based on large-scale magnitude estimation experiments. Compared with the present models, the CAM15u model can predict brightness, hue, colorfulness, saturation, and the amount of white by adopting the CIE 2006 cone fundamentals with a simplified calculation process, but it is restricted to a 10° FOV (field of view). Due to the size effects of unrelated self-luminous stimuli on brightness, the CAM15u model was modified as CAM15us [30] with a logarithmic function of the FOV.
The present study includes two major parts: (1) brightness estimation using gray stimuli, and (2) brightness and colorfulness estimation using color stimuli. Different sizes and background orientations were tested to see whether the visual results could be successfully predicted by using a weighted luminance of the viewing field as a key parameter. The experimental setup and the results of the two studies are introduced below.
BRIGHTNESS OF GRAY STIMULI
Experimental Setup
A total of 120 test patterns (3 gray stimuli × 2 surround luminances × 2 background luminances × 3 background orientations × 4 background sizes, excluding the conditions in which the gray-scale value of the stimulus equaled that of the background) were assessed in a brightness estimation experiment. Each test pattern was presented for 15 seconds. Figure 1 illustrates an example of the viewing field seen in a completely dark environment. The luminance of the three gray stimuli was 19, 88, and 227 cd/m², corresponding to 8-bit RGB gray levels of 64, 128, and 192. Note that the definitions of surround and background here differ from those in CIECAM02; we use the two terms solely to denote different areas of the test patterns.

Fig. 1. An example of the viewing field. Observers evaluated only the lighting appearance of the middle patch (Stimulus), subtending 4° and viewed from 40 cm.

Volume 42, Number 4, August 2017 441
All test patterns were displayed on a 65-inch LCD display (HD-65NC1, HERAN) with a resolution of 3840 × 2160 pixels. Its white-point and black-point luminances were 478 and 0.09 cd/m², respectively; the white-point (u′, v′) chromaticity was (0.185, 0.435), which is close to the D93 illuminant; and the gamma was 2.38, measured by a CA-310 color analyzer. The viewing angle of the circular stimulus was about 4° at the subject's position. Table I shows the parameter settings of the experiment. Examples of the different background orientations and sizes are shown in Figs. 2 and 3, respectively.
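The display characterization above (white point 478 cd/m², black point 0.09 cd/m², gamma 2.38) can be summarized with a simple gain-offset-gamma sketch. This is our own illustration, not part of the paper's method; the function name and model form are assumptions.

```python
# Illustrative sketch (not from the paper): a simple gamma model of the
# experimental display, using the measured values reported above.
L_WHITE = 478.0   # white-point luminance, cd/m^2
L_BLACK = 0.09    # black-point luminance, cd/m^2
GAMMA = 2.38      # measured with a CA-310 color analyzer

def gray_to_luminance(gray_8bit):
    """Approximate screen luminance (cd/m^2) of an 8-bit gray level."""
    norm = gray_8bit / 255.0
    return L_BLACK + (L_WHITE - L_BLACK) * norm ** GAMMA

# The three stimulus gray levels used in the experiment:
for g in (64, 128, 192):
    print(f"gray {g}: {gray_to_luminance(g):.1f} cd/m^2")
```

For gray levels 64 and 128 this simple model lands within a few cd/m² of the measured 19 and 88 cd/m²; the larger deviation at level 192 suggests a single-gamma model only approximates the display.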
Experimental Procedures
A psychophysical experiment was conducted using the magnitude estimation method [31] to obtain visual data from observers. Observers assessed the visual brightness of the central stimulus from about 40 cm with a head mount, as illustrated in Fig. 4. The resolution of the UHD TV was high enough that the observers were unable to visually resolve the RGB sub-pixels of the screen. White reflectors (FOMEX-D600) were placed outside the screen boundary to extend the light near the borders of the LCD and cover the whole viewing field.

For unrelated colors, no reference white was displayed in the viewing field, and therefore only absolute perceptual attributes such as brightness, colorfulness, and hue could be estimated directly. An 8 × 12 inch white sample (GretagMacbeth White Balance Card, 89.2% reflectance) was selected as the reference stimulus, and its brightness was assigned a value of 100 for scaling. The reference sample was illuminated by D50 simulators, and the luminance of its central area was 485 cd/m². Observers were required to memorize this reference stimulus, viewed in a VeriVide viewing cabinet, before conducting the experiment. During the experiment, each observer was seated in front of the LCD display in a completely dark environment. An experimental assistant was seated next to the blackout system to control the test patterns, the adaptation time, and the recording of the assessments. Twelve observers with normal vision, six males and six females, participated in the brightness assessment. The observers were 21 to 27 years old, and all of them were Taiwanese. Before starting the experiments, a 20-minute dark adaptation was required. Observers were asked to scale the visual brightness of the stimulus by assigning a number proportional to the magnitude of the reference, without a limited range. Each observer assessed every pattern once.
Observer Variability
The coefficient of variation (CV) shown in Eqs. (1) and (2) was adopted in this study to estimate the inter-observer and intra-observer variability. For better agreement between two sets of data, the CV value should be as small as possible; the larger the CV value, the poorer the agreement.
Fig. 2. Illustration of different background orientations (left: horizontal; middle: vertical; right: 16:9 square).

TABLE I. Parameters of the brightness experiment.

Factor                        | No. of levels | Gray level (8-bit) | Luminance (cd/m²)
Stimulus                      | 3             | 64, 128, 192       | 19, 88, 227
Background                    | 3             | 0, 128, 255        | 0.09, 88, 478
Surround                      | 2             | 0, 255             | 0.09, 478
Orientation of the background | 3             | H: horizontal rectangle; V: vertical rectangle; S: 16:9 square
Size of the background        | 4             | 0%, 12.5%, 50%, 100%

Fig. 3. Four different background sizes (left to right: 0%, 12.5%, 50%, 100%).

$$\mathrm{CV} = 100\sqrt{\frac{1}{N}\sum\frac{\left(\Delta E - f\,\Delta V\right)^{2}}{\overline{\Delta E}^{\,2}}},\qquad \text{scaling factor } f = \frac{\sum \Delta E\cdot\Delta V}{\sum \Delta V^{2}} \tag{1}$$

$$\mathrm{CV} = \frac{100}{\bar{y}}\sqrt{\frac{\sum\left(x_{i}-y_{i}\right)^{2}}{N}} \tag{2}$$

Here N is the number of evaluated stimuli, ΔV is the brightness of a stimulus seen by an individual observer, ΔE is the geometric mean of ΔV over all observers, ΔĒ is the arithmetic mean over all stimuli, and f is the scaling factor that adjusts between ΔV and ΔE. When Eq. (2) is used to investigate observer repeatability, xᵢ is the first judgment from a single observer, yᵢ is the second judgment, and ȳ is the arithmetic mean of the yᵢ.
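As a minimal sketch, Eqs. (1) and (2) can be computed as follows; the function names are ours, and `dE` and `dV` follow the definitions above.

```python
import math

def scaling_factor(dE, dV):
    """f = sum(dE * dV) / sum(dV^2), the adjustment factor in Eq. (1)."""
    return sum(e * v for e, v in zip(dE, dV)) / sum(v * v for v in dV)

def cv_inter(dE, dV):
    """Inter-observer CV of Eq. (1), in percent: dE holds the mean visual
    results over all observers, dV a single observer's judgments."""
    n = len(dE)
    f = scaling_factor(dE, dV)
    mean_dE = sum(dE) / n
    rms = math.sqrt(sum((e - f * v) ** 2 for e, v in zip(dE, dV)) / n)
    return 100.0 * rms / mean_dE

def cv_intra(x, y):
    """Intra-observer CV of Eq. (2), in percent: x and y are the first
    and second judgments of the same observer."""
    n = len(x)
    mean_y = sum(y) / n
    return (100.0 / mean_y) * math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / n)
```

An observer whose data are an exact scaled copy of the mean results gives CV = 0; random scatter inflates the value.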
Table II shows the mean results of the inter-observer and intra-observer variability in each experiment. The inter-observer variability for brightness estimation was acceptable at 18%, but was much higher, at 26%, in the colorfulness assessment to be introduced later. These agreements are significantly better than the results of 48% and 32% from Fu et al. [25] and 40% and 27% from Koo and Kwak [26], especially in terms of brightness estimation. Intra-observer variability showed similar results, at 11%, 19%, and 25% for the brightness of gray stimuli, the brightness of color stimuli, and colorfulness, respectively. These observer-repeatability CV values can be compared with the 22% for brightness and 32% for colorfulness reported by Fu et al. [25]
Visual Brightness
Visual brightness was obtained by discarding extreme data points and averaging the rest of the observer responses; the results are shown in Fig. 5. In this figure, Lt is the luminance of the circular stimulus, and H/V/S represents the horizontal/vertical/square (16:9) image background, respectively. The figure shows the following trends:

• The primary factors influencing visual brightness were background luminance (Lbg), surround luminance (Lsr), and background size (%).
• The orientation of the background (H, V, or S) had little effect on visual brightness.
• The visual brightness increased significantly under the dark surround (no background) condition. A 12.5% bright background could reduce the visual brightness sharply, and a 100% bright background would degrade it further to some degree; however, the effect was not significant when the luminance of the test stimulus was relatively high.
• The background size effect was much stronger for the dim test stimulus (Lt = 19) than for the bright stimulus (Lt = 227).
• Under the bright surround (Lsr = 478) and dark background (Lbg = 0) conditions, the visual brightness increased gradually as the background size expanded. However, if there is no background in the viewing field, the bright surround causes the dim test stimulus (Lt = 19) to be perceived as black. Its visual brightness could be heightened dramatically by using a small (12.5%) dark background (Lbg = 0).
Fig. 4. Viewing environment.

TABLE II. Mean results of inter-observer and intra-observer variability of the two experiments.

Coefficient of variation (CV) | Brightness (gray stimuli) | Brightness (color stimuli) | Colorfulness (color stimuli)
Inter-observer                | 18%                       | 18%                        | 26%
Intra-observer                | 11%                       | 19%                        | 25%

Fig. 5. Visual brightness under different stimulus luminance (Lt), background size, surround luminance (Lsr), background luminance (Lbg), and background orientation (H/V/S).

Viewing Parameters for CIECAM02 Brightness Q
Following the CIECAM02 guidelines, if we naïvely regard 1/5 of the display luminance as LA and the display white point as the reference white, the coefficient of determination r² between the visual brightness and CIECAM02 Q was 0.615 (Fig. 6, Case 1). The predictions of the CAM97u, CAM02u [25], and CAM15u [28] models were also investigated, where the r² values were 0.706, 0.846, and 0.757, respectively (Fig. 6, Cases 2, 3, and 4).

The optimal way to enhance the prediction is to take the following steps: (1) Use a 2D Gaussian mask, its center mapped to the central circular stimulus, to calculate the adapting luminance (Ls) and its chromaticity coordinates (xs, ys) from the scene; the sigma (standard deviation) of the 2D Gaussian is equivalent to 13 visual degrees (see Fig. 7). (2) Calculate the maximum XYZ value of the central stimulus [i.e., max(X, Y, Z), denoted Lt.max]. (3) Use the maximum of Lt.max and Ls as the luminance of the reference white (Lw). (4) Take 100·(Ls/Lw) as Yb. (5) If Yb < 4, set Yb = 4. (6) Determine the reference white's chromaticity coordinates (xw, yw) by Eqs. (3) and (4), where (xE, yE) are the chromaticity coordinates of CIE illuminant E (i.e., xE = yE = 1/3). (7) Scale the XYZ of the white point and the central stimulus so that Yw = 100. (8) Set LA = 0.5(Lw/5 + Ls) and use "Average" as the surround condition.

Steps 1–3 aim to estimate the luminance of a virtual reference white regardless of whether the surround is darker or brighter than the central stimulus. Step 5 corrects a CIECAM02 problem that overestimates brightness Q when Yb is extremely low. Step 6 estimates the chromaticity coordinates of the virtual white point: based on Fu's recommendation [25], CIE illuminant E is applied for a bright stimulus viewed in a dark (unrelated color) condition; however, when the adapting field is much brighter than the stimulus, the color of the virtual white point should be in accordance with the adapting field. As shown in Fig. 8, the r² can be increased to 0.964 by taking the above steps. However, CIECAM02 still has the issue that the Q values of dark colors are too high, since they should be close to zero. The brightness prediction in CIECAM02 should be revised, especially for dark colors.

$$x_{w} = (1-w)\,x_{E} + w\,x_{s}, \quad \text{where } w = \frac{L_{s}}{L_{t.\max} + L_{s}} \tag{3}$$

$$y_{w} = (1-w)\,y_{E} + w\,y_{s}, \quad \text{where } w = \frac{L_{s}}{L_{t.\max} + L_{s}} \tag{4}$$
Predicting Brightness by a UGR-Based Model

The CIE Unified Glare Rating (UGR), which has been widely used in the illumination industry, is a measure of the glare in a given environment [11]. It is basically the logarithm of the glare of all visible lamps, divided by the background illumination. Based on the structure of the CIE UGR, we designed a UGR-based model to predict the brightness of the central stimuli in the experiment. The most important parameter is the luminance of the adapting field (LA). We found that Eq. (5), which uses a weighted sum of Gaussian-weighted scene luminances at different scales (Lsmall, Lmid, and Llarge) in the LA calculation, achieved the best results. In Eq. (5), the sigma parameters of the Gaussian masks are equivalent to 3, 18, and 60 visual degrees for Lsmall, Lmid, and Llarge, as shown in Fig. 9, and the weights are 0.7, 0.2, and 0.1, respectively. The study took the influence of the aspect ratio of the mask into consideration; the weights are close to the reciprocal of the Guth position index used in the CIE Unified Glare Rating [11]. Optimal results were achieved by applying 1:0.8 (horizontal to vertical) elliptical Gaussian masks. The resulting brightness prediction is shown in Fig. 10, where the r² value is 0.972.

Fig. 6. Visual brightness under model prediction. Case 1: naïve CIECAM02 Q; Case 2: CAM97u Q; Case 3: CAM02u Q; Case 4: CAM15us Q.

$$B = 14.5\left\{\log\!\left[L_{t}^{2}/L_{A}\right] + 2.2\right\}, \quad \text{where } L_{A} = 0.7\,L_{small} + 0.2\,L_{mid} + 0.1\,L_{large} \tag{5}$$
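A sketch of Eq. (5) follows. The luminance image, the pixels-per-degree calibration, and the helper names are assumptions; the sigmas (3, 18, 60 visual degrees), the 1:0.8 elliptical aspect ratio, and the 0.7/0.2/0.1 weights come from the text, and the "+ 2.2" constant follows our reconstruction of the garbled printed equation.

```python
import numpy as np

def elliptical_gaussian_mean(lum, sigma_deg, ppd, center):
    """Mean of a luminance image weighted by a 1:0.8 (horizontal:vertical)
    elliptical Gaussian centered on the stimulus; sigma in visual degrees,
    ppd = pixels per degree (an assumed calibration)."""
    h, w = lum.shape
    cy, cx = center
    y = np.arange(h)[:, None] - cy
    x = np.arange(w)[None, :] - cx
    sx = sigma_deg * ppd           # horizontal sigma in pixels
    sy = 0.8 * sigma_deg * ppd     # vertical sigma (1:0.8 aspect ratio)
    mask = np.exp(-0.5 * ((x / sx) ** 2 + (y / sy) ** 2))
    return float((mask * lum).sum() / mask.sum())

def ugr_based_brightness(lum, Lt, ppd, center):
    """Eq. (5): B = 14.5 * (log10(Lt^2 / LA) + 2.2), with LA the weighted
    sum of three Gaussian-weighted scene luminances."""
    L_small, L_mid, L_large = (
        elliptical_gaussian_mean(lum, s, ppd, center) for s in (3, 18, 60))
    LA = 0.7 * L_small + 0.2 * L_mid + 0.1 * L_large
    return 14.5 * (np.log10(Lt ** 2 / LA) + 2.2)
```

On a uniform field the three Gaussian means coincide, so LA reduces to the field luminance and B grows monotonically with the stimulus luminance Lt.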
BRIGHTNESS AND COLORFULNESS OF COLOR STIMULI

The results of the above experiment showed that the orientation of the background pattern had little influence on brightness perception, so the next experiment tested the effects of background luminance, surround luminance, and background size on color appearance.
Test Patterns
Six color stimuli were used (R, G, and B at 8-bit gray levels 64 and 192 of the LCD TV). Figure 11 shows the stimuli in the CIE 1976 u′,v′ chromaticity space together with their equivalent luminances. Together with 2 surround luminances × 2 background luminances × 4 background sizes, a total of 96 test patterns were generated in this experiment. The viewing condition was identical to the above experiment, except that the magenta patch of an X-Rite ColorChecker chart was used as the colorfulness reference, assigned a value of 50 for scaling. Twelve observers were asked to scale the brightness and colorfulness of the 96 color stimuli.
Results
The brightness estimates for the 96 color stimuli agreed with the results of the first experiment. The relationships between visual colorfulness and the different viewing parameters are partly shown in Figs. 12 and 13. The figures clearly show that when the luminance of the stimulus is low (e.g., an 8-bit RGB level of 64), the background and surround influence its visual colorfulness. If the background is bright and large enough, colorfulness decreases sharply. Similarly, if the surround is bright and occupies a large space, the colorfulness also decreases; however, the impact of the surround is not as strong as that of the background.

Fig. 7. The 2D Gaussian mask for adapting luminance calculation.

Fig. 8. Visual brightness under model prediction. Case 5: parameter-optimal CIECAM02 Q.

Fig. 9. Illustration of different sizes of Gaussian masks (left to right: Lsmall, Lmid, Llarge).
Modeling
Optimal CIECAM02 Brightness Q
We combined the data of the two experiments for parameter optimization. The results show that the method introduced in the previous section can be used for both color and gray stimuli. The coefficient of determination r² between the visual data and the prediction can be improved from 0.613 (naïve) to 0.935 (optimal QC), as shown in Fig. 14.

Figure 14 (left) is the prediction of CIECAM02 Q when 1/5 of the display luminance is naïvely regarded as LA and the display white point as the reference white; the results are not acceptable. After applying the previously mentioned eight-step process to determine the optimal parameters and using a slope and intercept to fit the visual brightness, the r² increases from 0.613 to 0.706, as shown in Fig. 14 (middle). In the figure, most of the color stimuli lie below the 45° line. This might be caused by the Helmholtz–Kohlrausch effect: the visual brightness of saturated colors is slightly higher than that of neutral colors of the same luminance [32], which is not taken into account by CIECAM02. To optimize the fitting, a four-term polynomial function is applied as Eq. (6); it includes a nonlinearity of the brightness Q, the optimal CIECAM02 chroma C, and their interaction. The coefficients ci are 0.186, 0.837, −5.0e-3, and −8.45, in order. The coefficients indicate that the optimal brightness QC increases when C is high and Q is low. Figure 14 (right) shows the results of the fitting, where r² reaches 0.935. Note that all regression coefficients were derived from half of the test data (samples) and tested on the other half; the resulting r² values are very similar to those obtained by using the whole data set for both derivation and testing.

$$Q_{C} = c_{1}\,Q^{1.2} + c_{2}\,C + c_{3}\,Q\,C + c_{4} \tag{6}$$
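Eq. (6) amounts to a one-line function; the coefficients are transcribed from the text, and the function name is ours.

```python
def optimal_Q_C(Q, C, c=(0.186, 0.837, -5.0e-3, -8.45)):
    """Q_C = c1*Q^1.2 + c2*C + c3*Q*C + c4, Eq. (6)."""
    c1, c2, c3, c4 = c
    return c1 * Q ** 1.2 + c2 * C + c3 * Q * C + c4
```

The negative interaction coefficient c3 is what lets chroma C raise the predicted brightness most strongly when Q itself is low, matching the H–K behavior described above.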
Using the UGR-based Model
Using the two sets of experimental data, we found that elliptical Gaussian masks can enhance the brightness prediction. The results suggest that the luminance of the adapting field should be calculated as a center-weighted mean of the luminance of the whole viewing field. Taking account of the Helmholtz–Kohlrausch effect, the brightness prediction B of the UGR-based model was optimized with the polynomial function shown in Eq. (7); it includes a nonlinearity of the UGR-based brightness B, the optimal CIECAM02 chroma C, and their interaction. The coefficients ci are 0.494, −4.15e-4, −0.347, 7.39e-3, −3.69, and 14.6, in order. The resulting prediction is shown in Fig. 15. The coefficient of determination r² between the mean visual brightness and the model prediction is shown in Table III, where the UGR-based model performs better than the existing models. To predict brightness for a non-uniform scene, the luminance information of the adapting field is the main factor in the calculation; without a definition of the adapting field, the CAM15us model yields a relatively poorer correlation.

Fig. 10. Visual brightness under model prediction. Case 6: an optimal UGR-based model.

Fig. 11. The CIE 1976 u′,v′ chromaticity coordinates of the color stimuli.

Fig. 12. Visual colorfulness versus background size, surround luminance (Lsr), and background luminance (Lbg) when the stimulus is RGB = (64,0,0), (0,64,0), or (0,0,64).

$$B_{C} = c_{1}\,B + c_{2}\,B^{2} + c_{3}\,C + c_{4}\,C^{2} + c_{5}\,B\,C + c_{6} \tag{7}$$
Optimal CIECAM02 Colorfulness M

By applying the naïve parameters for CIECAM02 M, the prediction of visual colorfulness is poor (r² = 0.422). It can be improved to 0.588 by determining the parameters using the eight-step process introduced in the previous section (Fig. 16, left). Slightly better results can be achieved by using chroma C (Fig. 16, middle) instead of colorfulness M to predict the visual colorfulness. The optimal results were obtained by first using the method introduced in the previous section for the parameter setting, and then using the brightness Q, the luminance of the adapting field LA, and the background luminance factor Yb as parameters to fine-tune the prediction (Fig. 16, right). The optimal prediction MC is given by the polynomial function shown in Eq. (8), with coefficients ci of 0.841, −2.80e-05, 7.25e-2, −9.57e-07, −1.95, 5.67e-4, and −4.20, in order. A post-process, shown in Eq. (9), is used to improve the prediction for low-chroma stimuli. The r² between the visual colorfulness and the prediction can be improved from 0.422 (naïve) to 0.882 (optimized). It was found that the inter-observer agreement in colorfulness estimation is poorer than for the other appearance attributes, so there remains room for improvement. Table IV shows the coefficient of determination r² between visual colorfulness and the model predictions. None of the present color appearance models performed satisfactorily for colorfulness estimation under complex surround conditions.

Fig. 13. Visual colorfulness versus background size, surround luminance (Lsr), and background luminance (Lbg) when the stimulus is RGB = (192,0,0), (0,192,0), or (0,0,192).

Fig. 14. Visual brightness of color and gray stimuli under model prediction. Left: naïve CIECAM02 Q; middle: optimal CAM02 Q; right: optimal CAM02 QC.

Fig. 15. Visual brightness of color and gray stimuli predicted by a UGR-based model.

TABLE III. Coefficient of determination r² between visual brightness and model prediction.

r²                     | CIECAM02 | CAM02u | CAM97u,m | CAM15us | Optimal CIECAM02 | UGR-based model
Gray and color stimuli | 0.608    | 0.774  | 0.504    | 0.722   | 0.905            | 0.954

$$M_{C} = c_{1}\,C + c_{2}\,C^{3} + c_{3}\,Q + c_{4}\,Q^{3} + c_{5}\,L_{A}^{0.3} + c_{6}\,Y_{b}\,C + c_{7} \tag{8}$$

$$\text{if } M_{C} \le 30 \text{ then } M_{C} = \frac{C}{50}\,M_{C} \tag{9}$$
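Eqs. (8) and (9) can be sketched as follows; the coefficients are transcribed from the text, and the function name is ours.

```python
def optimal_M_C(C, Q, LA, Yb,
                c=(0.841, -2.80e-5, 7.25e-2, -9.57e-7, -1.95, 5.67e-4, -4.20)):
    """M_C by Eq. (8), with the Eq. (9) low-chroma post-process: a
    polynomial in chroma C, brightness Q, adapting-field luminance LA,
    and background luminance factor Yb."""
    c1, c2, c3, c4, c5, c6, c7 = c
    M = (c1 * C + c2 * C ** 3 + c3 * Q + c4 * Q ** 3
         + c5 * LA ** 0.3 + c6 * Yb * C + c7)
    if M <= 30:
        M = (C / 50.0) * M   # Eq. (9): damp the prediction for low chroma
    return M
```

The Eq. (9) branch scales small predictions by C/50, so an achromatic stimulus (C = 0) is always mapped to zero colorfulness.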
Effect of Stimulus Size
The results reported by Fu et al. [25] indicated that the size of circular stimuli affects the perception of brightness and colorfulness, which increase with the stimulus size. To consider the influence of stimulus size under a non-uniform surround, two sizes (4° and 10° angular subtense) of gray stimuli were investigated. Twelve observers performed brightness estimation for a total of 40 test patterns covering 3 gray stimuli, 2 surround luminances, 2 background luminances, and 4 background sizes. The CV value of inter-observer variability was 18%, and the visual data were obtained by discarding data with poor agreement and averaging the remaining evaluations. As shown in Fig. 17, the primary factors influencing the brightness of the stimuli were again background size, background luminance, and surround luminance, and the contribution of stimulus size (4° vs. 10° angular subtense) was small. The reason that Fu's study showed a significant impact of stimulus size is that the range of angular subtense they tested was much wider (from 0.5° to 60°). If we look only at the 4° to 10° range, the impact of stimulus size on brightness is relatively small compared with the other variables.

Fig. 16. Visual colorfulness under model prediction. Left: CIECAM02 M; middle: CIECAM02 C; right: optimal CIECAM02 MC.

TABLE IV. Coefficient of determination r² between visual colorfulness and model prediction.

r²            | CIECAM02 M | CAM02u | CAM97u,m | CAM15us | Optimal CIECAM02 MC
Color stimuli | 0.422      | 0.523  | 0.533    | 0.604   | 0.882

Fig. 17. Visual brightness with different sizes (Va) of grayscale stimuli under a non-uniform surround.
CONCLUSIONS
Psychophysical experiments were conducted to investigate color appearance under non-uniform surround conditions. The results show that the orientation of the background pattern and the size of the stimuli (4° or 10° angular subtense) are not that important. However, background size, background luminance, and surround luminance influence the appearance considerably. The experiments also clearly showed that the Helmholtz–Kohlrausch effect contributes to perceived brightness, especially for low-luminance targets under non-uniform surrounds.

A method to determine optimal parameters for the CIECAM02 color appearance model in lighting applications is recommended. A UGR-based model was also optimized for brightness estimation. It was found that the luminance of the adapting field can be estimated by Gaussian-like functions. This is an initial step in developing a light appearance model for illumination applications.
1. Fairchild MD. Color Appearance Models, 3rd edition. New York:
Wiley; 2013. p 289–310.
2. CIE 159-2004. A Colour Appearance Model for Colour Management
Systems: CIECAM02. Vienna, Austria: CIE Central Bureau; 2004.
3. Fu C, Li C, Luo MR, Hunt RWG, Pointer MR. Quantifying Colour Appearance for Unrelated Colour Under Photopic and Mesopic Vision, IS&T 15th Color Imaging Conference; 2007. p 319–324.
4. Fernandez-Maloigne C. Advanced Color Image Processing and Analy-
sis. New York: Springer; 2012. p 19–58.
5. Li HC, Sun PL, Green P. Evaluating Color Appearance and Visual Comfort of a Living Environment Using a Panoramic Camera, Proceedings of AIC 2012 Interim Meeting, Taipei; 2012.
6. Martínez-Verdú F, Pujol J, Capilla P. Characterization of a digital camera as an absolute tristimulus colorimeter. J Imaging Sci Tech 2003;47:279–295.
7. Wandell BA. Foundations of Vision. Sunderland, MA: Sinauer; 1995. p 7.
8. Thompson W, Fleming R, Creem-Regehr S, Stefanucci JK. Visual Per-
ception from a Computer Graphics Perspective. Boca Raton: CRC
Press; 2011. p 34.
9. Kingdom FA. Lightness, brightness and transparency: A quarter century of new ideas, captivating demonstrations and unrelenting controversy. Vision Res 2011;51:652–673.
10. Barten PGJ. Contrast Sensitivity of the Human Eye and Its Effects on Image Quality. Bellingham, WA: SPIE Press; 1999.
11. Peli E. Contrast in complex images. J Opt Soc Am A 1990;7:
2032–2040.
12. CIE 117-1995. Discomfort Glare in Interior Lighting. Vienna: CIE Cen-
tral Bureau; 1995.
13. Zhang XM, Wandell BA. A spatial extension of CIELAB for digital
color image reproduction. Proc SID Symp 1996;27:731–734.
14. Kuang J, Johnson GM, Fairchild MD. iCAM06: A refined image appearance model for HDR image rendering. J Vis Commun Image Repres 2007;18:406–414.
15. Yamaguchi H, Fairchild MD. A Study of Simultaneous Lightness
Perception for Stimuli with Multiple Illumination Levels, 12th Color
Imaging Conference; 2004. p 22–28.
16. Nakamura Y. Generating Perceived Color Image With Wavelet Transform, Proceedings of Pan-Pacific Imaging Conference '08; 2008. p 278–281.
17. Reinhard E, Khan EA, Akyüz AO, Johnson GM. Color Imaging: Fundamentals and Applications. Wellesley: A K Peters; 2008.
18. Nayatani Y. Simple estimation methods for the Helmholtz-Kohlrausch
effect. Color Res Appl 1997;22:385–401.
19. Heckaman RL, Fairchild MD, Wyble DR. The Effect of DLP Projector White Channel on Perceptual Gamut, IS&T/SID 13th Color Imaging Conference; 2005. p 205–210.
20. Fu C, Luo MR. Methods for Measuring Viewing Parameters in CIECAM02, IS&T/SID 13th Color Imaging Conference Proceedings. Springfield, VA: IS&T; 2005. p 69–74.
21. Yaguchi H, Monma C, Tokunaga K, Miyake Y. Color appearance in mesopic vision, Color Vision Deficiencies, Tokyo; 1990. p 21.
22. Shin JC, Yaguchi H, Shioiri S. Change of color appearance in photopic,
mesopic and scotopic vision. Opt Rev 2004;11:265–271.
23. Shin JC, Matsuki N, Yaguchi H, Shioiri S. A color appearance model applicable in mesopic vision. Opt Rev 2004;11:272–278.
24. Eloholma M, Halonen L. Performance based model for mesopic
photometry, Report no. 35, Lighting Laboratory, Helsinki University of
Technology; 2005.
25. Bodrogi P. Colour Appearance of Mesopic Related Colors at 0.3, 1, 3 and 10 cd/m²: Visual Magnitude Estimation and Modeling, in CIE x039:2014: Proceedings of CIE 2014 Lighting Quality & Energy Efficiency, April 2014, Kuala Lumpur, Malaysia. Vienna: CIE; 2014.
26. Fu C, Li C, Luo MR, Hunt RWG, Pointer MR. An investigation of colour appearance for unrelated colours under photopic and mesopic vision. Color Res Appl 2011;37:238–254.
27. Koo B, Kwak Y. Color appearance and color connotation models for
unrelated colors. Color Res Appl 2015;40:40–49.
28. Withouck M, Smet KA, Ryckaert WR, Deconinck G, Hanselaer P.
Predicting the brightness of unrelated self-luminous stimuli. Opt
Express 2014;22:16298–16309.
29. Withouck M, Smet KA, Ryckaert WR, Hanselaer P. Experimental
driven modelling of the color appearance of unrelated self-luminous
stimuli: CAM15u. Opt Express 2015;23:12045–12064.
30. Luo MR, Hunt RWG. Testing colour appearance models using
corresponding-colour and magnitude-estimation data sets. Color Res
Appl 1998;23:147–153.
31. Withouck M, Smet KA, Hanselaer P. Brightness prediction of different
sized unrelated self-luminous stimuli. Opt Express 2015;23:
13455–13466.
32. Fairchild MD. Color Appearance Models, 3rd edition. New York:
Wiley; 2013. p 123.
Volume 42, Number 4, August 2017 449