Clifford Plavin from Progressive Labs

Started by J.A.F._Doorhof, August 23, 2009, 09:39:04



Will start here very soon.

Clifford is one of the guys from Progressive Labs.
They manufacture some of the best analyzers in the world for ISF calibrators.

We recently upgraded to their top-of-the-line MicroSpec XL spectroradiometer and asked Cliff if he would be available for some questions from our visitors.
So please join in and ask Cliff your questions about calibration, their meters, the company, or whatever else you want to know.

ISF & HAA certified custom installer
Full ISF calibrations including HDR and 4K.

"Just because you can't imagine something doesn't mean it can't happen"


Let me start with the first questions:

Cliff, welcome.
Can you tell our readers a bit about your company and the products you sell and manufacture?

Also, can you give us your view on the calibration market?

Cliff Plavin

Progressive Labs has been building software to support popular tristimulus filter-based color analyzers since 1999.  Our software provides a number of unique features designed to allow the calibrator to work more efficiently and produce highly accurate results on all types of displays.

Over the years, display technologies have changed drastically with the introduction of LCD (including recent LED-backlit models), plasma flat panels, DLP/LCD projectors, and laser-based displays (Mitsubishi).  The new display technologies typically do not have primaries with chromaticity coordinates even remotely close to those of the CRT technologies of the past.  As a result, the tristimulus analyzers which were at one time adequate for display analysis and calibration are no longer accurate enough for our needs.

We currently support more than thirty different instruments, including those from GretagMacbeth, Sequel Imaging, X-Rite, Konica Minolta, and Photo Research, as well as pattern generators from AccuPel, Sencore, and our own recently released broadcast-model generator.  The MicroSpec spectroradiometer is an instrument we have built to our own specifications to provide accurate data down to very low light levels with short integration times.  The MicroSpec provides the same accuracy as far more expensive instruments such as the Minolta CS-2000, which is not something the typical home theater calibrator will be found carrying in their bag.  The standard model MicroSpec measures down to .25 fL, with a typical integration time of at most seven seconds, a spectral bandwidth of 5.0 nm, and an optical bandwidth of 1.0 nm per pixel.

The specifications of the MicroSpec are virtually identical to those of the CS-2000 in every respect, yet we are able to measure equivalent light levels at far shorter integration times with no loss in accuracy!  The MicroSpec comes calibrated for use with an optical fiber front end attached to the instrument with a type FC indexed connector.  All displays may be measured directly from the screen with the bare fiber.  When measuring an LCD flat panel, the supplied screw-on aperture is attached to reduce the field of view and eliminate off-axis viewing errors.  An optional cosine receptor may be purchased by those who need to measure directly from the lens of a projector.  Separate calibrations are accessed via the software for each of these configurations (bare fiber, LCD aperture, cosine receptor).

The recently introduced MicroSpec XL provides the same features as the standard model but is capable of measuring to a much lower light level, verified for accuracy down to .05 fL.

As the display calibration market is now predominantly digital devices, filter-based analyzers no longer provide adequate accuracy for many of the customer models we come across on a daily basis.  This has forced calibrators to seek out alternative measurement solutions.  A number of users have tried low-cost spectroradiometers; however, these products typically have poor low-light-level performance and inadequate bandwidth to accurately measure current displays.  The lack of a proper instrument affordable to the professional calibrator is what prompted us to design a product that would perform adequately in this environment.  The MicroSpec offers the performance of instruments costing 3-4 times its price with no sacrifice in measurement accuracy.

When using our software package, which integrates the color analyzer with a video pattern generator, a host of automated measurement routines greatly reduce the time required for testing and eliminate errors.  We have recently completed a few new functions which, at the touch of one button, allow the software to display the target brightness levels and chromaticity coordinates for 12 RGB transformation matrices.  A drop-down menu allows the user to select the desired brightness level from 0-100%, which adjusts the target values accordingly.  By pressing a button, the user may select the target for R, G, B, C, M, Y, or W and have all of the data presented on the main panel instantly!
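To give readers a feel for where such targets come from, here is a minimal sketch (not Progressive Labs' actual code; the function names are mine) that derives xyY calibration targets for R, G, B, C, M, Y, and W from the standard Rec.709/D65 RGB-to-XYZ matrix. Note that for linear stimuli the chromaticity (x, y) of each color stays constant as the brightness level changes; only luminance Y scales.

```python
# Hypothetical sketch of deriving xyY calibration targets from the
# standard Rec.709 (linear RGB, D65 white) RGB-to-XYZ matrix.
M = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

COLORS = {
    "R": (1, 0, 0), "G": (0, 1, 0), "B": (0, 0, 1),
    "C": (0, 1, 1), "M": (1, 0, 1), "Y": (1, 1, 0),
    "W": (1, 1, 1),
}

def target(color, percent=100.0):
    """Return the (x, y, Y) target for a linear stimulus at `percent` level."""
    r, g, b = (c * percent / 100.0 for c in COLORS[color])
    X, Y, Z = (row[0] * r + row[1] * g + row[2] * b for row in M)
    s = X + Y + Z
    return (X / s, Y / s, Y)  # chromaticity plus relative luminance

for name in COLORS:
    x, y, Y = target(name, 75.0)
    print(f"{name}: x={x:.4f} y={y:.4f} Y={Y:.4f}")
```

Running it shows, for example, that red stays at roughly x=0.640, y=0.330 regardless of the selected percentage, while its Y target scales linearly.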

I would be happy to answer any questions forum users have regarding color analyzers, pattern generators, and calibration issues in general.  I look forward to your questions and will be checking in to the forum frequently for the rest of the week.


Cliff Plavin
Progressive Labs



Minolta claims its 2000 model works at lower light levels than the 0.14 cd/m2 you spec above.

You say it is much quicker than Minolta's second-highest meter in its range; Minolta claims 1-16 seconds. Could you expand on what integration time means?

Please also expand on the bandwidth issues. You quote two bandwidth values; how does bandwidth come into play when trying to obtain accurate measurements of color and light?

Cliff Plavin

The standard model MicroSpec is capable of measuring down to a light level of .25 fL, which is just a bit lower than the CS-1000, the model the CS-2000 replaced.  The MicroSpec XL extends this to .05 fL, which is not as low as the CS-2000 but more than adequate for measurement and calibration of typical video displays of all types.  The price for the standard MicroSpec is $8,000.00 and the XL version is $10,000.00, compared to approximately $25,000.00 for the CS-2000.  The CS-2000's specified measurement time is 1 to 243 seconds, depending on whether it is set to Manual or Standard mode.  Its lowest measurable light level requires 243 seconds, which is a very long time to wait between making a single adjustment on a video display and the next measurement to see what your revised settings changed.  The MicroSpec operates in close to real time, so you are typically never waiting more than a few seconds, worst case, for the lowest-light-level measurements.
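Since the question above quotes Minolta's figure in cd/m2 while the MicroSpec floors are given in foot-lamberts, a quick unit conversion helps when comparing the specs (1 fL = 1/pi cd/ft2, which works out to about 3.4263 cd/m2):

```python
# Converting the foot-lambert floors quoted above to SI candela per
# square metre: 1 fL = (1/pi) cd/ft^2, i.e. about 3.4263 cd/m^2.
FL_TO_CDM2 = 3.4262591

def fl_to_cdm2(fl):
    return fl * FL_TO_CDM2

print(fl_to_cdm2(0.25))  # standard MicroSpec floor, in cd/m^2
print(fl_to_cdm2(0.05))  # MicroSpec XL floor, in cd/m^2
```

So .25 fL is roughly 0.86 cd/m2 and .05 fL is roughly 0.17 cd/m2.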

Integration time is the length of time needed to perform a measurement at a specific light level. Measurement time adds to this the dark measurement time, an identical-length period used to subtract the instrument's signal noise from the reading at that integration time, plus processing time.

The MicroSpec integration time ranges from 1 ms to 5000 ms depending on the light level and the binning mode, which is set automatically from 1-4 pixels as the light level decreases.  In any spectroradiometer, measurement time consists of the following elements: integration time + dark time + processing time = measurement time.  Our spectroradiometer uses a back-thinned CCD detector containing 2048 pixels, which allows us to combine pixels in a process called binning.  As the light level decreases, binning improves light throughput so that we can measure to lower light levels while keeping measurement periods short.  Our software performs the binning automatically as the light level drops, going from 2048 to 1024 pixels, then 682 pixels, and finally 512 pixels.
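The timing model described above can be sketched as follows. The pixel counts (2048 -> 1024 -> 682 -> 512) are the ones quoted in the text; the light-level thresholds at which binning switches are hypothetical, chosen purely for illustration:

```python
# Sketch of the measurement-time model described above:
#   measurement time = integration time + dark time + processing time,
# where the dark measurement takes the same length as the integration.

def binned_pixels(light_level_fl):
    """Pick a binning mode. Thresholds are illustrative, not the real firmware."""
    if light_level_fl >= 10.0:
        return 2048          # no binning
    if light_level_fl >= 2.0:
        return 1024          # 2-pixel binning
    if light_level_fl >= 0.5:
        return 682           # 3-pixel binning
    return 512               # 4-pixel binning

def measurement_time_s(integration_s, processing_s=0.5):
    """Total time for one reading: light read + equal-length dark read + processing."""
    return integration_s + integration_s + processing_s

print(binned_pixels(0.25), measurement_time_s(5.0))
```

The point of the model is that the dark read doubles the integration cost, which is why keeping integration times short at low light (via binning) matters so much in practice.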

Even at 512 pixels we still provide higher optical bandwidth than the reference instrument (Minolta CS-1000/2000), which has .9 nm/pixel optical bandwidth.  To calculate the optical bandwidth, take the wavelength range within which the instrument operates (380 nm - 780 nm), a 400 nm range, and divide it by the total number of pixels in use on the detector: 400/2048 = .1953 nm/pixel.  Multiplied by 4 pixels, binning of 4 still maintains a bandwidth of .78125 nm/pixel.  When you normalize the data to 1.0 nm/pixel, which is what the Minolta returns, you are able to maintain perfect agreement with the reference instrument.
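The arithmetic in that paragraph, spelled out:

```python
# Optical bandwidth = spectral range / detector pixels, scaled by binning.
SPECTRAL_RANGE_NM = 780 - 380   # 400 nm of usable range
DETECTOR_PIXELS = 2048

def optical_bandwidth_nm(binning=1):
    return SPECTRAL_RANGE_NM / DETECTOR_PIXELS * binning

print(optical_bandwidth_nm(1))  # ~0.1953 nm/pixel unbinned
print(optical_bandwidth_nm(4))  # 0.78125 nm/pixel with 4-pixel binning
```

Even fully binned, 0.78125 nm/pixel is still finer than the Minolta's quoted .9 nm/pixel.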

To provide an accurate measurement of a display with a complex spectral output, such as an LCD flat panel or another lamp-based display, you must have a bandwidth of better than 5.0 nm per pixel.  When using an instrument with less bandwidth, e.g. 10 nm/pixel (such as an X-Rite i1Pro), the data may contain errors, as spectral peaks lying closer together than the 10 nm specification will be seen by the instrument as one peak.  When this spectral data is converted to XYZ and then to xyY, the error is carried through.  The user will not see the error while calibrating to the chromaticity values displayed by the 10 nm instrument, yet by the time they have completed their calibration it may amount to a huge color error.

A wide-bandwidth instrument will not have an issue with the same spiky spectral output, as adjacent peaks will rarely lie as close together as 5.0 nm per pixel. This issue is also observed when measuring laser-based displays such as the Mitsubishi LaserVue RPTV.  As display technologies are evolving at a very rapid pace, it is easier to purchase one instrument today that can measure all displays equally well than to upgrade periodically as technologies change.  You have no accurate point of reference when claiming that your calibration is within a certain tolerance using anything less than a high-bandwidth spectroradiometer, as the calibration is likely to contain an error of unknown magnitude.
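A toy simulation makes the resolution argument concrete: two narrow emission peaks a few nanometres apart remain distinct on a fine (1 nm) grid but merge into a single bump once the spectrum is averaged into 10 nm bins. The peak positions and widths here are made up purely for illustration:

```python
import math

def spectrum(wavelength_nm, centers=(542.0, 548.0), sigma=1.5):
    """Toy emission spectrum: two narrow Gaussian peaks 6 nm apart."""
    return sum(math.exp(-((wavelength_nm - c) / sigma) ** 2 / 2)
               for c in centers)

fine = [spectrum(w) for w in range(380, 781)]      # 1 nm sampling
coarse = [sum(fine[i:i + 10]) / 10                 # 10 nm box bins
          for i in range(0, len(fine) - 10, 10)]

def count_peaks(values):
    """Count strict local maxima."""
    return sum(1 for i in range(1, len(values) - 1)
               if values[i] > values[i - 1] and values[i] > values[i + 1])

print(count_peaks(fine), count_peaks(coarse))
```

The fine sampling sees two peaks; the 10 nm binning sees only one, so any XYZ values computed from the coarse data start from a distorted spectrum.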



Let me also put something down :D

As you know, we do a lot of calibrations in the Netherlands.
A lot of people are always asking the same questions: is it really necessary, and why doesn't the manufacturer do it straight from the factory?
And can we just copy settings from one machine to another?

Our answer is always simple.
Every machine is different, so you need to calibrate each machine after a small run-in time.
And yes, the difference a calibration makes can range from night and day down to merely more than worth the effort.

But maybe it's informative for you to shed some light on this too?

Cliff Plavin

Manufacturers typically do not wish to spend the time to properly calibrate displays so that they conform to standards, as it requires time, which in the manufacturing world translates to "expensive".

The reason users cannot simply copy settings from one display to another of the same make and model is that the tolerances of the components used vary from one unit to the next and need individual adjustment.  Plasma panels are made in a process in which the mother glass takes approximately one month to pass through a conveyor that essentially "bakes" the material.  The glass is constantly monitored by a technician who makes small changes to the process while it is being made.

When the mother glass is completed and exits the "oven", it is cut and assembled by the manufacturer and must then be adjusted to provide a proper picture.  The manufacturers simply inspect and calibrate a "lot" of displays, which may be a dozen or so units out of a production run of thousands, to see how they look.  Once the inspection is made and the "inspection lot" displays are "calibrated" to meet the manufacturer's specifications, the same settings from the inspection lot are flashed to the balance of the panels.  The panels within a specific lot are fairly similar to each other; however, as noted earlier, the glass is in a constant state of flux while being manufactured, so glass exiting the oven on the 1st is not similar to glass exiting on the 28th of the month.

Using calibration values from a friend's calibrated unit of the same make and model may in fact make your display look bad!  The glass and other internal settings may differ substantially from the reference unit, which is why the two will never appear the same without a complete calibration performed on each piece.  The same type of problem exists with other technologies, such as lamp-based displays, as the lamps are not identical and age differently, providing different spectral characteristics that require individual calibration to correct.

Video projectors use a variety of lamp technologies, such as UHP halogen lamps, short-arc halogen lamps, Xenon lamps, etc., all of which produce substantially different spectral outputs from one another and affect picture quality.  Lamps initially degrade at an unstable rate, typically over the first 100 hours of use.  A projector should not be calibrated until the lamp has been broken in for at least this period, as the calibration will drift during it.  Once the lamp stabilizes after break-in, the calibration may be performed and the display will stay within a very close tolerance for a long period of time.  Once broken in, lamps typically remain spectrally stable and simply lose output power (brightness) over time.  This happens so gradually that it is not noticeable to the viewer until a fairly significant drop in brightness has taken place, which is typically the point when a user decides to replace the lamp.


To be complete: for projectors we advise calibrating at 200, 1,000, and 2,000 hours, but we really advise replacing the lamp at 1,500 hours.
Good to see your reply; it matches my opinion and experience 100% with regard to people using settings from other sets.
Some look nicer than out of the box, but most look very bad.
We see a trend of Pioneer plasma owners using settings found on the net, and most of them end up worse than if the user had simply used the Cinema mode straight out of the box.


As you are the first to release an analyser with these properties in this price category, what developments are you working on to stay ahead of the competition?
For people who do not calibrate for a living but do like to have good (the best) measuring equipment available, will there be a lower-priced version of the MicroSpec in the future that still offers narrow 2-5 nm sampling?

2 * Barco Cine 9 | Da-Lite tab-tensioned 3m | Crystalio II | Meridian 568.2MM | Meridian HD621 | Chord SPM 2000 | Pioneer HLD-X9 | PS3
Pioneer LX5090 | Dune BD Prime 3.0

Cliff Plavin

We could build an instrument providing the optical bandwidth specifications you mention (2 nm or 5 nm).  Our feeling is that a 5 nm instrument does not make much sense, as it is only marginally better than a 10 nm instrument such as the i1Pro.  With new display technologies emerging rapidly that have very narrow spectral peaks, such as laser and LED lighting, and that require both high spectral bandwidth and high optical bandwidth, a 2 nm solution is something we can look into building.

You can only work with certain detectors and electronics packages that provide very low noise levels; otherwise you are unable to measure to a low enough light level to make the instrument worth owning.  I will investigate whether there are any detector/electronics packages that make sense and would reduce the cost significantly so as to broaden the appeal to the serious enthusiast.  Lower-bandwidth instruments are able to collect more light and read to a lower light level at the expense of accuracy.  For example, an i1Pro measures down to approximately 1.0 fL thanks to its 10 nm bandwidth; if the bandwidth were improved to 5 nm, the collected light would be cut in half and the instrument would only measure down to 2.0 fL, which is obviously not appropriate for video display analysis.  An optical resolution of 10 nm per pixel is also not fine enough to deal with several of the new digital displays, as it will introduce errors in the measurement.
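The trade-off stated there is a simple proportional model: the lowest measurable light level scales inversely with the light collected per pixel, so halving the per-pixel bandwidth doubles the floor. A one-line sketch of that reasoning:

```python
# Simplified proportional model from the text: the measurable floor
# scales inversely with per-pixel bandwidth (less light per pixel
# means a higher minimum measurable level).

def floor_at_bandwidth(known_floor_fl, known_bw_nm, new_bw_nm):
    return known_floor_fl * (known_bw_nm / new_bw_nm)

print(floor_at_bandwidth(1.0, 10.0, 5.0))  # i1Pro example: 1.0 fL at 10 nm -> 2.0 fL at 5 nm
```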

Changing the entrance slit to allow more light into the instrument reduces the optical resolution and also allows the use of a larger-diameter fiber, so it is possible to build instruments with different characteristics, with each change affecting a specific area of the instrument.  We cannot achieve the type of performance a user would expect for display analysis with the spectroradiometers available for $3K, as they only provide a 300:1 signal-to-noise ratio, which is very poor.  These are instruments containing detectors such as the Sony ILX511 and comparable Toshiba detectors.