Instructions for Cloud Probe Processing

Cloud Probe Processing Workshop UND Page

SYNTHETIC DATA

Aaron Bansemer has generated synthetic data to simulate the response of the 2D-S, CIP and PIP and these data are now available at:

https://drive.google.com/drive/folders/1ZiinnejLj_U04QBQTtJfzA90Or_tqDmU?usp=sharing

The simulations all use the same assumed exponential size distribution, and there are simulations for both drops and columns. An extra set of synthetic data for the HVPS will be available in the near future (they are taking longer to run because of the large sample volume), and Aaron will upload them to the folder when they are complete. Another set of simulations assuming irregular ice particles will also be generated so that there is a range of particles that fits each instrument. These will be added once the HVPS simulations are complete.

In order to make the comparisons most meaningful, it is important that we all follow the same instructions for processing the data. As some of us did a similar exercise six years ago for the 2018 University of British Columbia workshop on evaluation of cloud probe processing software, I recommend that we follow the same instructions for this intercomparison. We will compare both particle-by-particle (PbP) and bulk parameter data. Note that I am attaching a poster presented at the 2018 AMS Cloud Physics Conference that describes the results of our previous intercomparison.

PROCESSING INSTRUCTIONS

First, please process the data and provide the PbP data for each particle, including the following: time, particle count in buffer, width (W), length (L), projected area (A), particle maximum dimension (D), perimeter (P), and a flag for whether the end diodes are shadowed (i.e., 0, 1 or 2 depending on whether no, one, or both end photodiodes are shadowed). For the first step we will only use the extent of the particle along the photodiode array (i.e., the width W) in the comparison. However, to make intercomparison easier at later steps, please include all the aforementioned parameters in the file. Note that there are different definitions of particle maximum dimension (i.e., time direction, diode direction, maximum of the two, minimum circle completely enclosing the particle, area-equivalent diameter, etc.; see Wu and McFarquhar 2016 for a list of possible metrics that can be used to characterize the dimension of a particle): feel free to use whatever dimension you typically use.
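As an illustration of the step 1 quantities, here is a minimal Python sketch of how W, L, A, D, P and the end-diode flag might be computed from a single particle's shadow image, represented as a 2-D boolean array. The function name, the choice of D = max(W, L), and the simple pixel-boundary perimeter estimate are all assumptions for illustration, not the convention of any particular processing package:

```python
import numpy as np

def pbp_parameters(image):
    """Compute per-particle parameters from a 2-D boolean shadow image.

    `image` has shape (n_slices, n_diodes): rows are time slices,
    columns are photodiode positions. Returns the step 1 quantities
    (all in pixel units).
    """
    rows, cols = np.nonzero(image)
    width = int(cols.max() - cols.min() + 1)   # W: extent along the diode array
    length = int(rows.max() - rows.min() + 1)  # L: extent along the time direction
    area = int(image.sum())                    # A: number of shadowed pixels
    # D taken here as the larger of W and L; this is only one of several
    # possible definitions (see Wu and McFarquhar 2016)
    max_dim = max(width, length)
    # Perimeter: shadowed pixels with at least one unshadowed 4-connected
    # neighbour (a simple pixel-boundary estimate)
    padded = np.pad(image.astype(int), 1)
    interior = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                padded[1:-1, :-2] + padded[1:-1, 2:]) == 4
    perimeter = int((image & ~interior).sum())
    # End-diode flag: 0, 1 or 2 depending on how many edge diodes are shadowed
    edge_flag = int(image[:, 0].any()) + int(image[:, -1].any())
    return {"W": width, "L": length, "A": area,
            "D": max_dim, "P": perimeter, "edge_flag": edge_flag}
```

For example, a fully shadowed 3x3 block away from both array edges gives W = L = D = 3, A = 9, P = 8, and an end-diode flag of 0.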

Thus, in summary the instructions for the files to provide for the intercomparison are as follows (repeat for each file):

  1. Extract information from the header. Generate a PbP file that contains the image time from header for each particle, count in buffer, W, L, A, D, P, end diode shadow flag.
  2. Determine counts recorded by the probe as a function of time, before any corrections for particle size are made or the sample volume is computed.
    1. Complete images only (assume all-in particles)
    2. Partial images only (assume particles not all-in)
    3. Complete and partial images (assume the sum of the above two)
      • UPDATE: We are looking for total counts per second here and not a count size distribution, since the size distribution is generated in step 4.
  3. Determine counts recorded by the probe as a function of time after corrections for particle size based on out-of-focus particles are made (assume we need to use the “water-processing” flag in SODA)
    1. Complete images only (assume all-in particles)
    2. Partial images only (assume particles not all-in)
    3. Complete and partial images (assume the sum of the above two)
  4. Number distribution functions before corrections for particle size (we will use counts per bin per second so that assumptions about the sample volume dependence on maximum dimension don't affect the results).
    • Use 1-second resolution for generating counts;
    • Bins should have widths corresponding to 1 pixel
    • Entries should correspond to number of counts in that bin for given second (should be integer)
    • Files should cover entire length of time in synthetic data files
    • Should include complete and partial images
      • UPDATE: In order for these size distributions to be comparable, we need to use the same definition of dimension. I suggest that we use maximum particle dimension so that we are all using the same definition (Note that there still may be some differences because there are different algorithms for defining maximum dimension). Note also that the simulations have a constant PSD over the full minute, so it is probably easier to aggregate everything into a single distribution rather than breaking down by the second. However, we can handle data submitted either way.
  5. Number distribution function after corrections for particle sizes (we will specify a priori the sample volume)
    1. Use same instructions as in step 4 and include complete and partial images
  6. Number distribution function after removal of artifacts (based on particle morphology)
  7. Number distribution function after removal of artifacts (based on particle morphology) and shattered particles (based on interarrival time)
  8. Number distribution function after removal of artifacts (based on particle morphology) and shattered particles (based on interarrival time) and reacceptance of spuriously removed particles.
    • UPDATE: Aaron reminded me that there is no shattering in the simulated datasets. Thus, steps 7/8 will probably not yield much in results (though if you have time to complete these steps, you will see how many non-artifact particles get removed!)
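As a sketch of the step 4 product, counts per 1-pixel-wide bin for one second of data could be accumulated along the following lines. The function name and the histogram convention (bin i counts particles with maximum dimension in [i, i+1) pixels) are illustrative assumptions, not a prescription:

```python
import numpy as np

def counts_per_bin(max_dims_px, n_bins):
    """Histogram per-particle maximum dimensions (in pixels) into bins
    one pixel wide, as requested in step 4.

    `max_dims_px` is a sequence of maximum dimensions for one second of
    data (or for the whole minute, if aggregating as suggested in the
    step 4 update). Bin i covers [i, i+1) pixels. Returns integer counts.
    """
    edges = np.arange(0, n_bins + 1)  # 1-pixel-wide bin edges
    counts, _ = np.histogram(max_dims_px, bins=edges)
    return counts.astype(int)
```

Since the simulated PSD is constant over the full minute, the same routine can be applied either once to all particles or repeatedly per second; the per-bin entries remain integer counts either way.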

It is possible we might not have time to complete all steps before the workshop. Please send your files after we complete each step so that we can intercompare the distributions on a step-by-step basis. We anticipate that these comparisons will be taking place continuously before we meet in Jeju to discuss the final comparisons. It will be very helpful to start receiving files soon.

It would be best if you could provide netCDF files with the output from your code. There are two sets of files for each probe (2DS, HVPS-3, CIP, PIP), one with simulated round drops and another with simulated columns, so please supply separate files for each probe. To allow us to more easily identify who supplied which files, we recommend the following naming convention for your supplied files: XXX_PPP_Z_S#_YY.cdf, where XXX is a unique three-character code (e.g., UIO for University of Illinois/Oklahoma software), PPP represents the probe (i.e., 2DS, HVP, CIP or PIP), Z represents the shape (i.e., either C for columns or R for round particles), S# represents the step number (i.e., use S1 for step 1, giving counts along the photodiode array), and YY represents the type of particles included (i.e., CO for complete, PA for partial and CP for complete + partial).
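A small helper, purely illustrative, that assembles file names following this convention and catches typos in the codes:

```python
def output_filename(code, probe, shape, step, particles):
    """Build a file name following the workshop convention
    XXX_PPP_Z_S#_YY.cdf, validating the codes defined in the
    instructions."""
    if probe not in {"2DS", "HVP", "CIP", "PIP"}:
        raise ValueError(f"unknown probe code: {probe}")
    if shape not in {"C", "R"}:
        raise ValueError(f"shape must be C (columns) or R (round): {shape}")
    if particles not in {"CO", "PA", "CP"}:
        raise ValueError(f"unknown particle-type code: {particles}")
    return f"{code}_{probe}_{shape}_S{step}_{particles}.cdf"
```

For example, `output_filename("UIO", "2DS", "R", 1, "CO")` returns `"UIO_2DS_R_S1_CO.cdf"`.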

Ideally we would like the PbP files by the end of March and the size distribution files by the end of April, so that there is ample time to perform the comparison before the Jeju workshop (and to obtain supplementary information if needed). Once we start receiving files, we will begin generating the intercomparison plots and let you know any pertinent information as we move to subsequent steps; there is always a possibility that additional information might be needed.

Please place your processed files on: https://drive.google.com/drive/folders/102GSajs4nEwh48OJvUzcuS00dbClBEaj?usp=sharing . Please send an email to Mastooreh Ameri (maameri@ou.edu) when you have placed files there so that she can check the files.

UPDATE: I've also been requested to add an example file with the PSD products (Steps 2-8) so that it can serve as a guide. I'm providing a link here to the 2018 drive that we used for that comparison, which shows good examples of the data files. https://drive.google.com/drive/folders/1uhTJJRcqXLFpz0z6QKXgqlPJnf-MTT27