Written by Gil Tene of Azul Systems, and released to the public domain, as explained at http://creativecommons.org/publicdomain/zero/1.0/

Author(s):
Gil Tene
   
   
   package org.HdrHistogram;
   
   import java.io.*;
   import java.nio.*;
   import java.util.Locale;
   import java.util.concurrent.atomic.AtomicLong;
   import java.util.concurrent.atomic.AtomicLongFieldUpdater;
This non-public AbstractHistogramBase super-class separation is meant to bunch "cold" fields separately from "hot" fields, in an attempt to force the JVM to place the (hot) fields commonly used in the value recording code paths close together. Subclass boundaries tend to strongly control memory layout decisions in most practical JVM implementations, making this an effective way to control field grouping and layout.
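The intent can be illustrated with a stripped-down sketch (class and field names below are illustrative, not the real API): since most JVMs lay out superclass fields before subclass fields, pushing rarely-touched fields into a base class keeps the subclass's hot fields adjacent to the counts storage it adds.

```java
// Illustrative sketch of the cold/hot field split (hypothetical names).
// Superclass fields tend to be laid out before subclass fields, so the
// rarely-touched fields below stay away from the recording hot path.
class ColdFieldsBase {
    long startTimeStampMsec;   // cold: metadata, never read while recording
    long endTimeStampMsec;     // cold
    boolean autoResize;        // cold
}

class HotFieldsHistogram extends ColdFieldsBase {
    int unitMagnitude;         // hot: consulted on every recorded value
    long subBucketMask;        // hot
    long totalCount;           // hot
    long[] counts = new long[16];

    // The recording path touches only the hot fields and the counts array,
    // which the subclass boundary keeps grouped together.
    void record(int index) {
        counts[index]++;
        totalCount++;
    }
}
```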
  
  
  abstract class AbstractHistogramBase extends EncodableHistogram {
      static AtomicLong constructionIdentityCount = new AtomicLong(0);
  
      // "Cold" accessed fields. Not used in the recording code path:
      long identity;
      volatile boolean autoResize = false;
  
      long highestTrackableValue;
      long lowestDiscernibleValue;
  
      int bucketCount;
      int subBucketCount;
      int countsArrayLength;
      int wordSizeInBytes;
  
      long startTimeStampMsec = Long.MAX_VALUE;
      long endTimeStampMsec = 0;
  
      double integerToDoubleValueConversionRatio = 1.0;
  
      int numberOfSignificantValueDigits;
  
      PercentileIterator percentileIterator;
      RecordedValuesIterator recordedValuesIterator;
  
  
  
      double getIntegerToDoubleValueConversionRatio() {
          return integerToDoubleValueConversionRatio;
      }
  
      void setIntegerToDoubleValueConversionRatio(double integerToDoubleValueConversionRatio) {
          this.integerToDoubleValueConversionRatio = integerToDoubleValueConversionRatio;
      }
  }

An abstract base class for integer-valued High Dynamic Range (HDR) Histograms

AbstractHistogram supports recording and analyzing sampled data value counts across a configurable integer value range, with configurable value precision within the range. Value precision is expressed as the number of significant digits in the value recording, and provides control over value quantization behavior across the value range and the subsequent value resolution at any given level.

For example, a Histogram could be configured to track the counts of observed integer values between 0 and 3,600,000,000 while maintaining a value precision of 3 significant digits across that range. Value quantization within the range will thus be no larger than 1/1,000th (or 0.1%) of any value. This example Histogram could be used to track and analyze the counts of observed response times ranging between 1 microsecond and 1 hour in magnitude, while maintaining a value resolution of 1 microsecond up to 1 millisecond, a resolution of 1 millisecond (or better) up to one second, and a resolution of 1 second (or better) up to 1,000 seconds. At its maximum tracked value (1 hour), it would still maintain a resolution of 3.6 seconds (or better).

See package description for org.HdrHistogram for details.
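The 3-significant-digit arithmetic above can be checked with a standalone sketch. This is plain arithmetic mirroring the histogram's bucket math (assuming a lowest discernible value of 1), not the HdrHistogram API:

```java
public class ResolutionSketch {
    // For 3 significant digits, unit resolution is needed up to
    // 2 * 10^3 = 2000 distinct values, rounded up to a power of two:
    // 2048 sub-buckets, so each bucket's upper half holds 1024 slots
    // (a half-count magnitude of 10).
    static final int SUB_BUCKET_COUNT = 2048;
    static final int SUB_BUCKET_HALF_COUNT_MAGNITUDE = 10;

    // Size of the equivalent-value range (the quantum) containing 'value':
    // values below 2048 get unit resolution; each doubling of magnitude
    // doubles the quantum, keeping relative error under 1/1000.
    static long quantum(long value) {
        int highestBit = 63 - Long.numberOfLeadingZeros(value | (SUB_BUCKET_COUNT - 1));
        return 1L << (highestBit - SUB_BUCKET_HALF_COUNT_MAGNITUDE);
    }

    public static void main(String[] args) {
        System.out.println(quantum(1_000));           // 1 (exact to one unit)
        System.out.println(quantum(1_000_000));       // 512 (~0.05% of the value)
        System.out.println(quantum(3_600_000_000L));  // 2097152 (~2.1s at 1 hour, in usec units)
    }
}
```

Note how, at the 1-hour end of the range, the quantum (about 2.1 seconds) comfortably beats the "3.6 seconds or better" resolution claimed above.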

  
  
  public abstract class AbstractHistogram extends AbstractHistogramBase implements Serializable {
  
      // "Hot" accessed fields (used in the value recording code path) are bunched here, such
      // that they will have a good chance of ending up in the same cache line as the totalCounts and
      // counts array reference fields that subclass implementations will typically add.
      int leadingZeroCountBase;
      int subBucketHalfCountMagnitude;
      int unitMagnitude;
      int subBucketHalfCount;
      long subBucketMask;
      volatile long maxValue = 0;
      volatile long minNonZeroValue = Long.MAX_VALUE;
  
      private static final AtomicLongFieldUpdater<AbstractHistogram> maxValueUpdater =
              AtomicLongFieldUpdater.newUpdater(AbstractHistogram.class, "maxValue");
      private static final AtomicLongFieldUpdater<AbstractHistogram> minNonZeroValueUpdater =
              AtomicLongFieldUpdater.newUpdater(AbstractHistogram.class, "minNonZeroValue");
 
     // Sub-classes will typically add a totalCount field and a counts array field, which will likely be laid out
     // right around here due to the subclass layout rules in most practical JVM implementations.
 
     //
     //
     //
     // Abstract, counts-type dependent methods to be provided by subclass implementations:
     //
     //
     //
 
     abstract long getCountAtIndex(int index);
 
     abstract long getCountAtNormalizedIndex(int index);
 
     abstract void incrementCountAtIndex(int index);
 
     abstract void addToCountAtIndex(int index, long value);
 
     abstract void setCountAtIndex(int index, long value);
 
     abstract void setCountAtNormalizedIndex(int index, long value);
 
     abstract int getNormalizingIndexOffset();
 
     abstract void setNormalizingIndexOffset(int normalizingIndexOffset);
 
     abstract void shiftNormalizingIndexByOffset(int offsetToAdd, boolean lowestHalfBucketPopulated);
 
     abstract void setTotalCount(long totalCount);
 
     abstract void incrementTotalCount();
 
     abstract void addToTotalCount(long value);
 
     abstract void clearCounts();
 
     abstract int _getEstimatedFootprintInBytes();
 
     abstract void resize(long newHighestTrackableValue);

    
Get the total count of all recorded values in the histogram

Returns:
the total count of all recorded values in the histogram
 
     abstract public long getTotalCount();

    
Set internally tracked maxValue to new value if new value is greater than current one. May be overridden by subclasses for synchronization or atomicity purposes.

Parameters:
value new maxValue to set
 
     void updatedMaxValue(final long value) {
         while (value > maxValue) {
             maxValueUpdater.compareAndSet(this, maxValue, value);
         }
     }
 
     final void resetMaxValue(final long maxValue) {
         this.maxValue = maxValue;
     }

    
Set internally tracked minNonZeroValue to new value if new value is smaller than current one. May be overridden by subclasses for synchronization or atomicity purposes.

Parameters:
value new minNonZeroValue to set
 
     void updateMinNonZeroValue(final long value) {
         while (value < minNonZeroValue) {
             minNonZeroValueUpdater.compareAndSet(this, minNonZeroValue, value);
         }
     }
 
     void resetMinNonZeroValue(final long minNonZeroValue) {
         this.minNonZeroValue = minNonZeroValue;
     }
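The lock-free update loops above can be exercised in isolation with a plain AtomicLong standing in for the volatile field + AtomicLongFieldUpdater pair. This is a sketch of the same retry shape, not the histogram's internals:

```java
import java.util.concurrent.atomic.AtomicLong;

public class MaxTrackerSketch {
    private final AtomicLong max = new AtomicLong(0);

    // Same shape as updatedMaxValue(): keep retrying while our value still
    // exceeds the published maximum. A failed compareAndSet means another
    // thread raced us; the loop condition re-reads and re-checks, so the
    // update is lost only when a larger value has already been published.
    public void update(long value) {
        long current;
        while (value > (current = max.get())) {
            max.compareAndSet(current, value);
        }
    }

    public long get() {
        return max.get();
    }

    public static void main(String[] args) {
        MaxTrackerSketch t = new MaxTrackerSketch();
        t.update(5);
        t.update(3);   // smaller value: the loop exits immediately, no write
        t.update(9);
        System.out.println(t.get()); // 9
    }
}
```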
 
     //
     //
     //
     // Construction:
     //
     //
     //
 
    
Construct an auto-resizing histogram with a lowest discernible value of 1 and an auto-adjusting highestTrackableValue. Can auto-resize to track values up to (Long.MAX_VALUE / 2).

Parameters:
numberOfSignificantValueDigits The number of significant decimal digits to which the histogram will maintain value resolution and separation. Must be a non-negative integer between 0 and 5.
 
     protected AbstractHistogram(final int numberOfSignificantValueDigits) {
         this(1, 2, numberOfSignificantValueDigits);
         autoResize = true;
     }

    
Construct a histogram given the Lowest and Highest values to be tracked and a number of significant decimal digits. Providing a lowestDiscernibleValue is useful in situations where the units used for the histogram's values are much smaller than the minimal accuracy required. E.g. when tracking time values stated in nanosecond units, where the minimal accuracy required is a microsecond, the proper value for lowestDiscernibleValue would be 1000.

Parameters:
lowestDiscernibleValue The lowest value that can be discerned (distinguished from 0) by the histogram. Must be a positive integer that is >= 1. May be internally rounded down to nearest power of 2.
highestTrackableValue The highest value to be tracked by the histogram. Must be a positive integer that is >= (2 * lowestDiscernibleValue).
numberOfSignificantValueDigits The number of significant decimal digits to which the histogram will maintain value resolution and separation. Must be a non-negative integer between 0 and 5.
 
     protected AbstractHistogram(final long lowestDiscernibleValue, final long highestTrackableValue,
                                 final int numberOfSignificantValueDigits) {
         // Verify argument validity
         if (lowestDiscernibleValue < 1) {
             throw new IllegalArgumentException("lowestDiscernibleValue must be >= 1");
         }
         if (highestTrackableValue < 2L * lowestDiscernibleValue) {
             throw new IllegalArgumentException("highestTrackableValue must be >= 2 * lowestDiscernibleValue");
         }
         if ((numberOfSignificantValueDigits < 0) || (numberOfSignificantValueDigits > 5)) {
             throw new IllegalArgumentException("numberOfSignificantValueDigits must be between 0 and 5");
         }
 
         init(lowestDiscernibleValue, highestTrackableValue, numberOfSignificantValueDigits, 1.0, 0);
     }

    
Construct a histogram with the same range settings as a given source histogram, duplicating the source's start/end timestamps (but NOT its contents)

Parameters:
source The source histogram to duplicate
 
     protected AbstractHistogram(final AbstractHistogram source) {
         this(source.getLowestDiscernibleValue(), source.getHighestTrackableValue(),
                 source.getNumberOfSignificantValueDigits());
         this.setStartTimeStamp(source.getStartTimeStamp());
         this.setEndTimeStamp(source.getEndTimeStamp());
         this.autoResize = source.autoResize;
     }
 
     @SuppressWarnings("deprecation")
     private void init(final long lowestDiscernibleValue,
                       final long highestTrackableValue,
                       final int numberOfSignificantValueDigits,
                       final double integerToDoubleValueConversionRatio,
                       final int normalizingIndexOffset) {
         this.lowestDiscernibleValue = lowestDiscernibleValue;
         this.highestTrackableValue = highestTrackableValue;
         this.numberOfSignificantValueDigits = numberOfSignificantValueDigits;
         this.integerToDoubleValueConversionRatio = integerToDoubleValueConversionRatio;
         if (normalizingIndexOffset != 0) {
             setNormalizingIndexOffset(normalizingIndexOffset);
         }
 
         final long largestValueWithSingleUnitResolution = 2 * (long) Math.pow(10, numberOfSignificantValueDigits);
 
         unitMagnitude = (int) Math.floor(Math.log(lowestDiscernibleValue)/Math.log(2));
 
         // We need to maintain power-of-two subBucketCount (for clean direct indexing) that is large enough to
         // provide unit resolution to at least largestValueWithSingleUnitResolution. So figure out
         // largestValueWithSingleUnitResolution's nearest power-of-two (rounded up), and use that:
         int subBucketCountMagnitude = (int) Math.ceil(Math.log(largestValueWithSingleUnitResolution)/Math.log(2));
         subBucketHalfCountMagnitude = ((subBucketCountMagnitude > 1) ? subBucketCountMagnitude : 1) - 1;
         subBucketCount = (int) Math.pow(2, (subBucketHalfCountMagnitude + 1));
         subBucketHalfCount = subBucketCount / 2;
         subBucketMask = ((long) subBucketCount - 1) << unitMagnitude;
 
 
         // determine exponent range needed to support the trackable value with no overflow:
         establishSize(highestTrackableValue);
 
         // Establish leadingZeroCountBase, used in getBucketIndex() fast path:
         leadingZeroCountBase = 64 - unitMagnitude - subBucketHalfCountMagnitude - 1;
 
         percentileIterator = new PercentileIterator(this, 1);
         recordedValuesIterator = new RecordedValuesIterator(this);
     }
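The derivation in init() can be reproduced standalone. For example, with lowestDiscernibleValue = 1 and 2 significant digits (values chosen purely for illustration):

```java
public class BucketGeometrySketch {
    // Reproduces the geometry arithmetic from init(); returns
    // {subBucketCount, subBucketHalfCount, subBucketMask, leadingZeroCountBase}.
    static long[] geometry(long lowestDiscernibleValue, int numberOfSignificantValueDigits) {
        // N significant digits require unit resolution up to 2 * 10^N:
        long largestValueWithSingleUnitResolution =
                2 * (long) Math.pow(10, numberOfSignificantValueDigits);

        int unitMagnitude =
                (int) Math.floor(Math.log(lowestDiscernibleValue) / Math.log(2));

        // Round up to the nearest power of two for clean direct indexing:
        int subBucketCountMagnitude =
                (int) Math.ceil(Math.log(largestValueWithSingleUnitResolution) / Math.log(2));
        int subBucketHalfCountMagnitude =
                ((subBucketCountMagnitude > 1) ? subBucketCountMagnitude : 1) - 1;
        int subBucketCount = (int) Math.pow(2, subBucketHalfCountMagnitude + 1);
        int subBucketHalfCount = subBucketCount / 2;
        long subBucketMask = ((long) subBucketCount - 1) << unitMagnitude;
        int leadingZeroCountBase = 64 - unitMagnitude - subBucketHalfCountMagnitude - 1;

        return new long[] {subBucketCount, subBucketHalfCount, subBucketMask, leadingZeroCountBase};
    }

    public static void main(String[] args) {
        // 2 significant digits: 200 rounds up to 256 sub-buckets (half count 128),
        // mask 255, and a leading-zero-count base of 56.
        long[] g = geometry(1, 2);
        System.out.println(g[0] + " " + g[1] + " " + g[2] + " " + g[3]);
    }
}
```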
 
     final void establishSize(long newHighestTrackableValue) {
         // establish counts array length:
         countsArrayLength = determineArrayLengthNeeded(newHighestTrackableValue);
         // establish exponent range needed to support the trackable value with no overflow:
         bucketCount = getBucketsNeededToCoverValue(newHighestTrackableValue);
         // establish the new highest trackable value:
         highestTrackableValue = newHighestTrackableValue;
     }
 
     final int determineArrayLengthNeeded(long highestTrackableValue) {
         if (highestTrackableValue < 2L * lowestDiscernibleValue) {
             throw new IllegalArgumentException("highestTrackableValue (" + highestTrackableValue +
                     ") cannot be < (2 * lowestDiscernibleValue)");
         }
         //determine counts array length needed:
         int countsArrayLength = getLengthForNumberOfBuckets(getBucketsNeededToCoverValue(highestTrackableValue));
         return countsArrayLength;
     }
 
     //
     //
     // Auto-resizing control:
     //
     //
 
     public boolean isAutoResize() {
         return autoResize;
     }
 
     public void setAutoResize(boolean autoResize) {
         this.autoResize = autoResize;
     }
 
     //
     //
     //
     // Value recording support:
     //
     //
     //
 
    
Record a value in the histogram

Parameters:
value The value to be recorded
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if value exceeds highestTrackableValue
 
     public void recordValue(final long value) throws ArrayIndexOutOfBoundsException {
         recordSingleValue(value);
     }

    
Record a value in the histogram (adding to the value's current count)

Parameters:
value The value to be recorded
count The number of occurrences of this value to record
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if value exceeds highestTrackableValue
 
     public void recordValueWithCount(final long value, final long count) throws ArrayIndexOutOfBoundsException {
         recordCountAtValue(count, value);
     }

    
Record a value in the histogram.

To compensate for the loss of sampled values when a recorded value is larger than the expected interval between value samples, Histogram will auto-generate an additional series of decreasingly-smaller (down to the expectedIntervalBetweenValueSamples) value records.

Note: This is an at-recording correction method, as opposed to the post-recording correction method provided by copyCorrectedForCoordinatedOmission(long). The two methods are mutually exclusive, and only one of the two should be used on a given data set to correct for the same coordinated omission issue.

See notes in the description of the Histogram calls for an illustration of why this corrective behavior is important.

Parameters:
value The value to record
expectedIntervalBetweenValueSamples If expectedIntervalBetweenValueSamples is larger than 0, add auto-generated value records as appropriate if value is larger than expectedIntervalBetweenValueSamples
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if value exceeds highestTrackableValue
 
     public void recordValueWithExpectedInterval(final long value, final long expectedIntervalBetweenValueSamples)
             throws ArrayIndexOutOfBoundsException {
         recordSingleValueWithExpectedInterval(value, expectedIntervalBetweenValueSamples);
     }

    

Deprecated:
Record a value in the histogram. This deprecated method has identical behavior to recordValueWithExpectedInterval(). It was renamed to avoid ambiguity.
Parameters:
value The value to record
expectedIntervalBetweenValueSamples If expectedIntervalBetweenValueSamples is larger than 0, add auto-generated value records as appropriate if value is larger than expectedIntervalBetweenValueSamples
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if value exceeds highestTrackableValue
 
     public void recordValue(final long value, final long expectedIntervalBetweenValueSamples)
             throws ArrayIndexOutOfBoundsException {
         recordValueWithExpectedInterval(value, expectedIntervalBetweenValueSamples);
     }
 
     private void updateMinAndMax(final long value) {
         if (value > maxValue) {
             updatedMaxValue(value);
         }
         if ((value < minNonZeroValue) && (value != 0)) {
             updateMinNonZeroValue(value);
         }
     }
 
     private void recordCountAtValue(final long count, final long value)
             throws ArrayIndexOutOfBoundsException {
         int countsIndex = countsArrayIndex(value);
         try {
             addToCountAtIndex(countsIndex, count);
         } catch (ArrayIndexOutOfBoundsException ex) {
             handleRecordException(count, value, ex);
         } catch (IndexOutOfBoundsException ex) {
             handleRecordException(count, value, ex);
         }
         updateMinAndMax(value);
         addToTotalCount(count);
     }
 
     private void recordSingleValue(final long value) throws ArrayIndexOutOfBoundsException {
         int countsIndex = countsArrayIndex(value);
         try {
             incrementCountAtIndex(countsIndex);
         } catch (ArrayIndexOutOfBoundsException ex) {
             handleRecordException(1, value, ex);
         } catch (IndexOutOfBoundsException ex) {
             handleRecordException(1, value, ex);
         }
         updateMinAndMax(value);
         incrementTotalCount();
     }
 
     private void handleRecordException(final long count, final long value, Exception ex) {
         if (!autoResize) {
             throw new ArrayIndexOutOfBoundsException("value outside of histogram covered range. Caused by: " + ex);
         }
         resize(value);
         int countsIndex = countsArrayIndex(value);
         addToCountAtIndex(countsIndex, count);
     }
 
     private void recordValueWithCountAndExpectedInterval(final long value, final long count,
                                                          final long expectedIntervalBetweenValueSamples)
             throws ArrayIndexOutOfBoundsException {
         recordCountAtValue(count, value);
         if (expectedIntervalBetweenValueSamples <= 0)
             return;
         for (long missingValue = value - expectedIntervalBetweenValueSamples;
              missingValue >= expectedIntervalBetweenValueSamples;
              missingValue -= expectedIntervalBetweenValueSamples) {
             recordCountAtValue(count, missingValue);
         }
     }
 
     private void recordSingleValueWithExpectedInterval(final long value,
                                                        final long expectedIntervalBetweenValueSamples)
             throws ArrayIndexOutOfBoundsException {
         recordSingleValue(value);
         if (expectedIntervalBetweenValueSamples <= 0)
             return;
         for (long missingValue = value - expectedIntervalBetweenValueSamples;
              missingValue >= expectedIntervalBetweenValueSamples;
              missingValue -= expectedIntervalBetweenValueSamples) {
             recordSingleValue(missingValue);
         }
     }
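The two backfill loops above produce, for example, synthetic records at 90, 80, ..., 10 when a 100-unit observation arrives with an expected interval of 10. A standalone sketch of the loop, collecting into a list instead of a histogram:

```java
import java.util.ArrayList;
import java.util.List;

public class ExpectedIntervalSketch {
    // Mirrors the loop shape of recordSingleValueWithExpectedInterval():
    // record the observed value, then synthesize decreasingly-smaller values,
    // stepping down by the expected interval and stopping once below it.
    static List<Long> valuesRecorded(long value, long expectedInterval) {
        List<Long> recorded = new ArrayList<>();
        recorded.add(value);
        if (expectedInterval <= 0) {
            return recorded; // no coordinated-omission correction requested
        }
        for (long missingValue = value - expectedInterval;
             missingValue >= expectedInterval;
             missingValue -= expectedInterval) {
            recorded.add(missingValue);
        }
        return recorded;
    }

    public static void main(String[] args) {
        // One 100-unit stall, with samples expected every 10 units:
        System.out.println(valuesRecorded(100, 10));
        // [100, 90, 80, 70, 60, 50, 40, 30, 20, 10]
    }
}
```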
 
     //
     //
     //
     // Clearing support:
     //
     //
     //
 
    
Reset the contents and stats of this histogram
 
     public void reset() {
         clearCounts();
         resetMaxValue(0);
         resetMinNonZeroValue(Long.MAX_VALUE);
         setNormalizingIndexOffset(0);
     }
 
     //
     //
     //
     // Copy support:
     //
     //
     //
 
    
Create a copy of this histogram, complete with data and everything.

Returns:
A distinct copy of this histogram.
 
     abstract public AbstractHistogram copy();

    
Get a copy of this histogram, corrected for coordinated omission.

To compensate for the loss of sampled values when a recorded value is larger than the expected interval between value samples, the new histogram will include an auto-generated additional series of decreasingly-smaller (down to the expectedIntervalBetweenValueSamples) value records for each count found in the current histogram that is larger than the expectedIntervalBetweenValueSamples. Note: This is a post-correction method, as opposed to the at-recording correction method provided by recordValueWithExpectedInterval. The two methods are mutually exclusive, and only one of the two should be used on a given data set to correct for the same coordinated omission issue.

See notes in the description of the Histogram calls for an illustration of why this corrective behavior is important.

Parameters:
expectedIntervalBetweenValueSamples If expectedIntervalBetweenValueSamples is larger than 0, add auto-generated value records as appropriate if value is larger than expectedIntervalBetweenValueSamples
Returns:
a copy of this histogram, corrected for coordinated omission.
 
     abstract public AbstractHistogram copyCorrectedForCoordinatedOmission(long expectedIntervalBetweenValueSamples);

    
Copy this histogram into the target histogram, overwriting its contents.

Parameters:
targetHistogram the histogram to copy into
 
     public void copyInto(final AbstractHistogram targetHistogram) {
         targetHistogram.reset();
         targetHistogram.add(this);
         targetHistogram.setStartTimeStamp(this.startTimeStampMsec);
         targetHistogram.setEndTimeStamp(this.endTimeStampMsec);
     }

    
Copy this histogram, corrected for coordinated omission, into the target histogram, overwriting its contents. (see copyCorrectedForCoordinatedOmission(long) for a more detailed explanation of how correction is applied)

Parameters:
targetHistogram the histogram to copy into
expectedIntervalBetweenValueSamples If expectedIntervalBetweenValueSamples is larger than 0, add auto-generated value records as appropriate if value is larger than expectedIntervalBetweenValueSamples
 
     public void copyIntoCorrectedForCoordinatedOmission(final AbstractHistogram targetHistogram,
                                                         final long expectedIntervalBetweenValueSamples) {
         targetHistogram.reset();
         targetHistogram.addWhileCorrectingForCoordinatedOmission(this, expectedIntervalBetweenValueSamples);
         targetHistogram.setStartTimeStamp(this.startTimeStampMsec);
         targetHistogram.setEndTimeStamp(this.endTimeStampMsec);
     }
 
     //
     //
     //
     // Add support:
     //
     //
     //
 
    
Add the contents of another histogram to this one.

As part of adding the contents, the start/end timestamp range of this histogram will be extended to include the start/end timestamp range of the other histogram.

Parameters:
otherHistogram The other histogram.
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if values in otherHistogram are higher than highestTrackableValue.
 
     public void add(final AbstractHistogram otherHistogram) throws ArrayIndexOutOfBoundsException {
         long highestRecordableValue = highestEquivalentValue(valueFromIndex(countsArrayLength - 1));
         if (highestRecordableValue < otherHistogram.getMaxValue()) {
             if (!isAutoResize()) {
                 throw new ArrayIndexOutOfBoundsException(
                         "The other histogram includes values that do not fit in this histogram's range.");
             }
             resize(otherHistogram.getMaxValue());
         }
         if ((bucketCount == otherHistogram.bucketCount) &&
                 (subBucketCount == otherHistogram.subBucketCount) &&
                 (unitMagnitude == otherHistogram.unitMagnitude) &&
                 (getNormalizingIndexOffset() == otherHistogram.getNormalizingIndexOffset())) {
             // Counts arrays are of the same length and meaning, so we can just iterate and add directly:
             long observedOtherTotalCount = 0;
             for (int i = 0; i < otherHistogram.countsArrayLength; i++) {
                 long otherCount = otherHistogram.getCountAtIndex(i);
                 if (otherCount > 0) {
                     addToCountAtIndex(i, otherCount);
                     observedOtherTotalCount += otherCount;
                 }
             }
             setTotalCount(getTotalCount() + observedOtherTotalCount);
             updatedMaxValue(Math.max(getMaxValue(), otherHistogram.getMaxValue()));
             updateMinNonZeroValue(Math.min(getMinNonZeroValue(), otherHistogram.getMinNonZeroValue()));
         } else {
             // Arrays are not a direct match, so we can't just stream through and add them.
             // Instead, go through the array and add each non-zero value found at its proper value:
             for (int i = 0; i < otherHistogram.countsArrayLength; i++) {
                 long otherCount = otherHistogram.getCountAtIndex(i);
                 if (otherCount > 0) {
                     recordValueWithCount(otherHistogram.valueFromIndex(i), otherCount);
                 }
             }
         }
         setStartTimeStamp(Math.min(otherHistogram.startTimeStampMsec, startTimeStampMsec));
         setEndTimeStamp(Math.max(otherHistogram.endTimeStampMsec, endTimeStampMsec));
     }

    
Subtract the contents of another histogram from this one.

The start/end timestamps of this histogram will remain unchanged.

Parameters:
otherHistogram The other histogram.
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if values in otherHistogram are higher than highestTrackableValue.
 
     public void subtract(final AbstractHistogram otherHistogram)
             throws ArrayIndexOutOfBoundsException, IllegalArgumentException {
         long highestRecordableValue = valueFromIndex(countsArrayLength - 1);
         if (highestRecordableValue < otherHistogram.getMaxValue()) {
             if (!isAutoResize()) {
                 throw new ArrayIndexOutOfBoundsException(
                         "The other histogram includes values that do not fit in this histogram's range.");
             }
             resize(otherHistogram.getMaxValue());
         }
         if ((bucketCount == otherHistogram.bucketCount) &&
                 (subBucketCount == otherHistogram.subBucketCount) &&
                 (unitMagnitude == otherHistogram.unitMagnitude) &&
                 (getNormalizingIndexOffset() == otherHistogram.getNormalizingIndexOffset())) {
             // Counts arrays are of the same length and meaning, so we can just iterate and add directly:
             long observedOtherTotalCount = 0;
             for (int i = 0; i < otherHistogram.countsArrayLength; i++) {
                 long otherCount = otherHistogram.getCountAtIndex(i);
                 if (otherCount > 0) {
                     if (getCountAtIndex(i) < otherCount) {
                         throw new IllegalArgumentException("otherHistogram count (" + otherCount + ") at value " +
                                 valueFromIndex(i) + " is larger than this one's (" + getCountAtIndex(i) + ")");
                     }
                     addToCountAtIndex(i, -otherCount);
                     observedOtherTotalCount += otherCount;
                 }
             }
             setTotalCount(getTotalCount() - observedOtherTotalCount);
             updatedMaxValue(Math.max(getMaxValue(), otherHistogram.getMaxValue()));
             updateMinNonZeroValue(Math.min(getMinNonZeroValue(), otherHistogram.getMinNonZeroValue()));
         } else {
             // Arrays are not a direct match, so we can't just stream through and add them.
             // Instead, go through the array and add each non-zero value found at its proper value:
             for (int i = 0; i < otherHistogram.countsArrayLength; i++) {
                 long otherCount = otherHistogram.getCountAtIndex(i);
                 if (otherCount > 0) {
                     long otherValue = otherHistogram.valueFromIndex(i);
                     if (getCountAtValue(otherValue) < otherCount) {
                         throw new IllegalArgumentException("otherHistogram count (" + otherCount + ") at value " +
                                 otherValue + " is larger than this one's (" + getCountAtValue(otherValue) + ")");
                     }
                     recordValueWithCount(otherValue, -otherCount);
                 }
             }
         }
         // With subtraction, the max and minNonZero values could have changed:
         if ((getCountAtValue(getMaxValue()) <= 0) || getCountAtValue(getMinNonZeroValue()) <= 0) {
             establishInternalTackingValues();
         }
     }

    
Add the contents of another histogram to this one, while correcting the incoming data for coordinated omission.

To compensate for the loss of sampled values when a recorded value is larger than the expected interval between value samples, the values added will include an auto-generated additional series of decreasingly-smaller (down to the expectedIntervalBetweenValueSamples) value records for each count found in the current histogram that is larger than the expectedIntervalBetweenValueSamples. Note: This is a post-recording correction method, as opposed to the at-recording correction method provided by recordValueWithExpectedInterval. The two methods are mutually exclusive, and only one of the two should be used on a given data set to correct for the same coordinated omission issue.

See notes in the description of the Histogram calls for an illustration of why this corrective behavior is important.

Parameters:
otherHistogram The other histogram. highestTrackableValue and largestValueWithSingleUnitResolution must match.
expectedIntervalBetweenValueSamples If expectedIntervalBetweenValueSamples is larger than 0, add auto-generated value records as appropriate if value is larger than expectedIntervalBetweenValueSamples
Throws:
java.lang.ArrayIndexOutOfBoundsException (may throw) if values exceed highestTrackableValue
 
     public void addWhileCorrectingForCoordinatedOmission(final AbstractHistogram otherHistogram,
                                                          final long expectedIntervalBetweenValueSamples) {
         final AbstractHistogram toHistogram = this;
 
         for (HistogramIterationValue v : otherHistogram.recordedValues()) {
             toHistogram.recordValueWithCountAndExpectedInterval(v.getValueIteratedTo(),
                     v.getCountAtValueIteratedTo(), expectedIntervalBetweenValueSamples);
         }
     }
 
     //
     //
     //
     // Shifting support:
     //
     //
     //
 
    
Shift recorded values to the left (the equivalent of a << shift operation on all recorded values). The configured integer value range limits and value precision setting will remain unchanged. A java.lang.ArrayIndexOutOfBoundsException will be thrown if any recorded values may be lost as a result of the attempted operation, reflecting an "overflow" condition. Expect such an overflow exception if the operation would cause the current maxValue to be scaled to a value that is outside of the covered value range.

Parameters:
numberOfBinaryOrdersOfMagnitude The number of binary orders of magnitude to shift by
 
     public void shiftValuesLeft(final int numberOfBinaryOrdersOfMagnitude) {
         if (numberOfBinaryOrdersOfMagnitude < 0) {
             throw new IllegalArgumentException("Cannot shift by a negative number of magnitudes");
         }
 
         if (numberOfBinaryOrdersOfMagnitude == 0) {
             return;
         }
         if (getTotalCount() == getCountAtIndex(0)) {
             // (no need to shift any values if all recorded values are at the 0 value level:)
             return;
         }
 
         final int shiftAmount = numberOfBinaryOrdersOfMagnitude << subBucketHalfCountMagnitude;
         int maxValueIndex = countsArrayIndex(getMaxValue());
         // indicate overflow if maxValue is in the range being wrapped:
         if (maxValueIndex >= (countsArrayLength - shiftAmount)) {
             throw new ArrayIndexOutOfBoundsException(
                     "Operation would overflow, would discard recorded value counts");
         }
 
         long maxValueBeforeShift = maxValueUpdater.getAndSet(this, 0);
         long minNonZeroValueBeforeShift = minNonZeroValueUpdater.getAndSet(this, Long.MAX_VALUE);
 
         boolean lowestHalfBucketPopulated = (minNonZeroValueBeforeShift < subBucketHalfCount);
 
         // Perform the shift:
         shiftNormalizingIndexByOffset(shiftAmount, lowestHalfBucketPopulated);
 
         // adjust min, max:
         updateMinAndMax(maxValueBeforeShift << numberOfBinaryOrdersOfMagnitude);
         if (minNonZeroValueBeforeShift < Long.MAX_VALUE) {
             updateMinAndMax(minNonZeroValueBeforeShift << numberOfBinaryOrdersOfMagnitude);
         }
     }
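A left shift by n binary orders of magnitude multiplies every recorded value by 2^n, and the guard above rejects the operation when the scaled maximum would leave the covered range. The guard's effect can be sketched at the value level (a hypothetical standalone form, not the index-based check the histogram actually performs):

```java
public class ShiftGuardSketch {
    // Value-level analogue of shiftValuesLeft()'s overflow check:
    // scaling maxRecordedValue by 2^n must stay within the covered range,
    // otherwise recorded counts would be discarded.
    static long shiftedMax(long maxRecordedValue, long highestTrackableValue,
                           int numberOfBinaryOrdersOfMagnitude) {
        long shifted = maxRecordedValue << numberOfBinaryOrdersOfMagnitude;
        if (shifted > highestTrackableValue) {
            throw new ArrayIndexOutOfBoundsException(
                    "Operation would overflow, would discard recorded value counts");
        }
        return shifted;
    }

    public static void main(String[] args) {
        // 1000 << 2 = 4000, still within a 3,600,000,000 range: allowed.
        System.out.println(shiftedMax(1_000, 3_600_000_000L, 2)); // 4000
    }
}
```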
 
     void nonConcurrentNormalizingIndexShift(int shiftAmount, boolean lowestHalfBucketPopulated) {
 
         // Save and clear the 0 value count:
         long zeroValueCount = getCountAtIndex(0);
         setCountAtIndex(0, 0);
 
         setNormalizingIndexOffset(getNormalizingIndexOffset() + shiftAmount);
 
         // Deal with lower half bucket if needed:
         if (lowestHalfBucketPopulated) {
             shiftLowestHalfBucketContentsLeft(shiftAmount);
         }
 
         // Restore the 0 value count:
         setCountAtIndex(0, zeroValueCount);
     }
 
     void shiftLowestHalfBucketContentsLeft(int shiftAmount) {
         final int numberOfBinaryOrdersOfMagnitude = shiftAmount >> subBucketHalfCountMagnitude;
 
         // The lowest half-bucket (not including the 0 value) is special: unlike all other half
         // buckets, the lowest half bucket values cannot be scaled by simply changing the
         // normalizing offset. Instead, they must be individually re-recorded at the new
         // scale, and cleared from the current one.
         //
         // We know that all half buckets "below" the current lowest one are full of 0s, because
         // we would have overflowed otherwise. So we need to shift the values in the current
         // lowest half bucket into that range (including the current lowest half bucket itself).
         // Iterating up from the lowermost non-zero "from slot" and copying values to the newly
         // scaled "to slot" (and then zeroing the "from slot"), will work in a single pass,
        // because the scaled "to slot" index will always be a lower index than its or any
         // preceding non-scaled "from slot" index:
         //
         // (Note that we specifically avoid slot 0, as it is directly handled in the outer case)
 
        for (int fromIndex = 1; fromIndex < subBucketHalfCount; fromIndex++) {
             long toValue = valueFromIndex(fromIndex) << numberOfBinaryOrdersOfMagnitude;
             int toIndex = countsArrayIndex(toValue);
             long countAtFromIndex = getCountAtNormalizedIndex(fromIndex);
            setCountAtIndex(toIndex, countAtFromIndex);
             setCountAtNormalizedIndex(fromIndex, 0);
         }
 
         // Note that the above loop only creates O(N) work for histograms that have values in
         // the lowest half-bucket (excluding the 0 value). Histograms that never have values
         // there (e.g. all integer value histograms used as internal storage in DoubleHistograms)
         // will never loop, and their shifts will remain O(1).
     }

    
Shift recorded values to the right (the equivalent of a >> shift operation on all recorded values). The configured integer value range limits and value precision setting will remain unchanged.

Shift right operations that do not underflow are reversible with a shift left operation with no loss of information. An java.lang.ArrayIndexOutOfBoundsException reflecting an "underflow" condition will be thrown if any recorded values may lose representation accuracy as a result of the attempted shift operation.

For a shift of a single order of magnitude, expect such an underflow exception if any recorded non-zero values up to [lowestDiscernibleValue (rounded up to nearest power of 2) multiplied by (2 ^ numberOfBinaryOrdersOfMagnitude)] currently exist in the histogram.

Parameters:
numberOfBinaryOrdersOfMagnitude The number of binary orders of magnitude to shift by
 
     public void shiftValuesRight(final int numberOfBinaryOrdersOfMagnitude) {
         if (numberOfBinaryOrdersOfMagnitude < 0) {
             throw new IllegalArgumentException("Cannot shift by a negative number of magnitudes");
         }
 
         if (numberOfBinaryOrdersOfMagnitude == 0) {
             return;
         }
         if (getTotalCount() == getCountAtIndex(0)) {
             // (no need to shift any values if all recorded values are at the 0 value level:)
             return;
         }
 
        final int shiftAmount = subBucketHalfCount * numberOfBinaryOrdersOfMagnitude;
 
         // indicate underflow if minValue is in the range being shifted from:
         int minNonZeroValueIndex = countsArrayIndex(getMinNonZeroValue());
        // Any shifting into the bottom-most half bucket would represent a loss of accuracy,
         // and a non-reversible operation. Therefore any non-0 value that falls in an
         // index below (shiftAmount + subBucketHalfCount) would represent an underflow:
        if (minNonZeroValueIndex < shiftAmount + subBucketHalfCount) {
             throw new ArrayIndexOutOfBoundsException(
                     "Operation would underflow and lose precision of already recorded value counts");
         }
 
         // perform shift:
 
        long maxValueBeforeShift = maxValueUpdater.getAndSet(this, 0);
        long minNonZeroValueBeforeShift = minNonZeroValueUpdater.getAndSet(this, Long.MAX_VALUE);
 
         // move normalizingIndexOffset
        shiftNormalizingIndexByOffset(-shiftAmount, false);
 
         // adjust min, max:
         updateMinAndMax(maxValueBeforeShift >> numberOfBinaryOrdersOfMagnitude);
        if (minNonZeroValueBeforeShift < Long.MAX_VALUE) {
             updateMinAndMax(minNonZeroValueBeforeShift >> numberOfBinaryOrdersOfMagnitude);
         }
     }
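The shift operations above are most easily understood from the caller's side. The following sketch (not part of the HdrHistogram sources, assuming the org.HdrHistogram library is on the classpath) shows that a shiftValuesRight() that does not underflow exactly reverses a prior shiftValuesLeft():

```java
import org.HdrHistogram.Histogram;

public class ShiftDemo {
    public static void main(String[] args) {
        // Track values up to one hour (in microseconds) at 3 significant digits:
        Histogram histogram = new Histogram(3600000000L, 3);
        histogram.recordValue(1048576L); // 2^20, well above the lowest half bucket

        // Shifting left by one binary order of magnitude doubles recorded values:
        histogram.shiftValuesLeft(1);
        System.out.println(histogram.valuesAreEquivalent(histogram.getMaxValue(), 2097152L));

        // Shifting back right restores the original values with no loss:
        histogram.shiftValuesRight(1);
        System.out.println(histogram.valuesAreEquivalent(histogram.getMaxValue(), 1048576L));
    }
}
```

Note that shifting a small value (one that would land in the lowest half bucket) right instead throws the underflow ArrayIndexOutOfBoundsException described above.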
 
     //
     //
     //
     // Comparison support:
     //
     //
     //
 
    
Determine if this histogram is equivalent to another.

Parameters:
other the other histogram to compare to
Returns:
True if this histogram is equivalent to the other.
 
     public boolean equals(final Object other){
         if ( this == other ) {
             return true;
         }
         if ( !(other instanceof AbstractHistogram) ) {
             return false;
         }
         AbstractHistogram that = (AbstractHistogram)other;
        if ((lowestDiscernibleValue != that.lowestDiscernibleValue) ||
                (highestTrackableValue != that.highestTrackableValue) ||
                (numberOfSignificantValueDigits != that.numberOfSignificantValueDigits) ||
                (integerToDoubleValueConversionRatio != that.integerToDoubleValueConversionRatio)) {
             return false;
         }
        if (countsArrayLength != that.countsArrayLength) {
             return false;
         }
         if (getTotalCount() != that.getTotalCount()) {
             return false;
         }
        for (int i = 0; i < countsArrayLength; i++) {
             if (getCountAtIndex(i) != that.getCountAtIndex(i)) {
                 return false;
             }
         }
         return true;
     }
 
     //
     //
     //
     // Histogram structure querying support:
     //
     //
     //
 
    
get the configured lowestDiscernibleValue

Returns:
lowestDiscernibleValue
 
     public long getLowestDiscernibleValue() {
        return lowestDiscernibleValue;
     }

    
get the configured highestTrackableValue

Returns:
highestTrackableValue
 
     public long getHighestTrackableValue() {
        return highestTrackableValue;
     }

    
get the configured numberOfSignificantValueDigits

Returns:
numberOfSignificantValueDigits
 
     public int getNumberOfSignificantValueDigits() {
        return numberOfSignificantValueDigits;
     }

    
Get the size (in value units) of the range of values that are equivalent to the given value within the histogram's resolution. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value The given value
Returns:
The size (in value units) of the range of values equivalent to the given value.
 
     public long sizeOfEquivalentValueRange(final long value) {
         final int bucketIndex = getBucketIndex(value);
        final int subBucketIndex = getSubBucketIndex(value, bucketIndex);
        long distanceToNextValue =
                (1L << (unitMagnitude + ((subBucketIndex >= subBucketCount) ? (bucketIndex + 1) : bucketIndex)));
         return distanceToNextValue;
     }

    
Get the lowest value that is equivalent to the given value within the histogram's resolution. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value The given value
Returns:
The lowest value that is equivalent to the given value within the histogram's resolution.
 
     public long lowestEquivalentValue(final long value) {
         final int bucketIndex = getBucketIndex(value);
        final int subBucketIndex = getSubBucketIndex(value, bucketIndex);
        long thisValueBaseLevel = valueFromIndex(bucketIndex, subBucketIndex);
         return thisValueBaseLevel;
     }

    
Get the highest value that is equivalent to the given value within the histogram's resolution. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value The given value
Returns:
The highest value that is equivalent to the given value within the histogram's resolution.
 
     public long highestEquivalentValue(final long value) {
         return nextNonEquivalentValue(value) - 1;
     }

    
Get a value that lies in the middle (rounded up) of the range of values equivalent to the given value. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value The given value
Returns:
The value that lies in the middle (rounded up) of the range of values equivalent to the given value.
 
     public long medianEquivalentValue(final long value) {
         return (lowestEquivalentValue(value) + (sizeOfEquivalentValueRange(value) >> 1));
     }

    
Get the next value that is not equivalent to the given value within the histogram's resolution. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value The given value
Returns:
The next value that is not equivalent to the given value within the histogram's resolution.
 
     public long nextNonEquivalentValue(final long value) {
         return lowestEquivalentValue(value) + sizeOfEquivalentValueRange(value);
     }

    
Determine if two values are equivalent to one another within the histogram's resolution. Where "equivalent" means that value samples recorded for any two equivalent values are counted in a common total count.

Parameters:
value1 first value to compare
value2 second value to compare
Returns:
True if the values are equivalent within the histogram's resolution.
    public boolean valuesAreEquivalent(final long value1, final long value2) {
        return (lowestEquivalentValue(value1) == lowestEquivalentValue(value2));
    }
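The equivalent-value query methods above can be seen together in a short sketch (not from the HdrHistogram sources, assuming org.HdrHistogram is on the classpath). With 3 significant digits, values below 2048 are tracked exactly; around 10000 the resolution is 8 units:

```java
import org.HdrHistogram.Histogram;

public class EquivalenceDemo {
    public static void main(String[] args) {
        Histogram histogram = new Histogram(3600000000L, 3);

        // At value 10000 each counts-array slot covers 8 value units:
        System.out.println(histogram.sizeOfEquivalentValueRange(10000)); // 8
        System.out.println(histogram.lowestEquivalentValue(10001));      // 10000
        System.out.println(histogram.highestEquivalentValue(10000));     // 10007
        System.out.println(histogram.nextNonEquivalentValue(10000));     // 10008

        // 10000 and 10007 share a slot; 10008 starts the next one:
        System.out.println(histogram.valuesAreEquivalent(10000, 10007)); // true
        System.out.println(histogram.valuesAreEquivalent(10000, 10008)); // false
    }
}
```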

    
Provide a (conservatively high) estimate of the Histogram's total footprint in bytes

Returns:
a (conservatively high) estimate of the Histogram's total footprint in bytes
    public int getEstimatedFootprintInBytes() {
        return _getEstimatedFootprintInBytes();
    }
    //
    //
    //
    // Timestamp support:
    //
    //
    //

    
get the start time stamp [optionally] stored with this histogram

Returns:
the start time stamp [optionally] stored with this histogram
    @Override
    public long getStartTimeStamp() {
        return startTimeStampMsec;
    }

    
Set the start time stamp value associated with this histogram to a given value.

Parameters:
timeStampMsec the value to set the time stamp to, [by convention] in msec since the epoch.
    @Override
    public void setStartTimeStamp(final long timeStampMsec) {
        this.startTimeStampMsec = timeStampMsec;
    }

    
get the end time stamp [optionally] stored with this histogram

Returns:
the end time stamp [optionally] stored with this histogram
    @Override
    public long getEndTimeStamp() {
        return endTimeStampMsec;
    }

    
Set the end time stamp value associated with this histogram to a given value.

Parameters:
timeStampMsec the value to set the time stamp to, [by convention] in msec since the epoch.
    @Override
    public void setEndTimeStamp(final long timeStampMsec) {
        this.endTimeStampMsec = timeStampMsec;
    }
    //
    //
    //
    // Histogram Data access support:
    //
    //
    //

    
Get the lowest recorded value level in the histogram. If the histogram has no recorded values, the value returned is undefined.

Returns:
the Min value recorded in the histogram
    public long getMinValue() {
        if ((getCountAtIndex(0) > 0) || (getTotalCount() == 0)) {
            return 0;
        }
        return getMinNonZeroValue();
    }

    
Get the highest recorded value level in the histogram. If the histogram has no recorded values, the value returned is undefined.

Returns:
the Max value recorded in the histogram
    public long getMaxValue() {
        return (maxValue == 0) ? 0 : highestEquivalentValue(maxValue);
    }

    
Get the lowest recorded non-zero value level in the histogram. If the histogram has no recorded values, the value returned is undefined.

Returns:
the lowest recorded non-zero value level in the histogram
    public long getMinNonZeroValue() {
        return (minNonZeroValue == Long.MAX_VALUE) ?
                Long.MAX_VALUE : lowestEquivalentValue(minNonZeroValue);
    }

    
Get the highest recorded value level in the histogram as a double

Returns:
the Max value recorded in the histogram
    @Override
    public double getMaxValueAsDouble() {
        return getMaxValue();
    }

    
Get the computed mean value of all recorded values in the histogram

Returns:
the mean value (in value units) of the histogram data
    public double getMean() {
        if (getTotalCount() == 0) {
            return 0.0;
        }
        recordedValuesIterator.reset();
        double totalValue = 0;
        while (recordedValuesIterator.hasNext()) {
            HistogramIterationValue iterationValue = recordedValuesIterator.next();
            totalValue += medianEquivalentValue(iterationValue.getValueIteratedTo())
                    * iterationValue.getCountAtValueIteratedTo();
        }
        return (totalValue * 1.0) / getTotalCount();
    }

    
Get the computed standard deviation of all recorded values in the histogram

Returns:
the standard deviation (in value units) of the histogram data
    public double getStdDeviation() {
        if (getTotalCount() == 0) {
            return 0.0;
        }
        final double mean = getMean();
        double geometric_deviation_total = 0.0;
        recordedValuesIterator.reset();
        while (recordedValuesIterator.hasNext()) {
            HistogramIterationValue iterationValue = recordedValuesIterator.next();
            double deviation = (medianEquivalentValue(iterationValue.getValueIteratedTo()) * 1.0) - mean;
            geometric_deviation_total += (deviation * deviation) * iterationValue.getCountAddedInThisIterationStep();
        }
        double std_deviation = Math.sqrt(geometric_deviation_total / getTotalCount());
        return std_deviation;
    }
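A quick sketch of the summary statistics (not from the HdrHistogram sources, assuming org.HdrHistogram on the classpath). Values 1..1000 fall below the exact-representation threshold (2048 at 3 significant digits), so the mean of a uniform 1..1000 recording is exact:

```java
import org.HdrHistogram.Histogram;

public class StatsDemo {
    public static void main(String[] args) {
        Histogram histogram = new Histogram(3600000000L, 3);
        for (long v = 1; v <= 1000; v++) {
            histogram.recordValue(v);
        }
        // Mean of 1..1000 is exactly 500.5; std. deviation is
        // sqrt((1000^2 - 1) / 12) ~= 288.67 for a discrete uniform distribution:
        System.out.println(histogram.getMean());         // 500.5
        System.out.println(histogram.getStdDeviation()); // ~288.67
    }
}
```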

    
Get the value at a given percentile. When the given percentile is > 0.0, the value returned is the value that the given percentage of the overall recorded value entries in the histogram are either smaller than or equivalent to. When the given percentile is 0.0, the value returned is the value that all value entries in the histogram are either larger than or equivalent to.

Note that two values are "equivalent" in this statement if valuesAreEquivalent(long,long) would return true.

Parameters:
percentile The percentile for which to return the associated value
Returns:
The value that the given percentage of the overall recorded value entries in the histogram are either smaller than or equivalent to. When the percentile is 0.0, returns the value that all value entries in the histogram are either larger than or equivalent to.
    public long getValueAtPercentile(final double percentile) {
        final double requestedPercentile = Math.min(percentile, 100.0); // Truncate down to 100%
        long countAtPercentile = (long)(((requestedPercentile / 100.0) * getTotalCount()) + 0.5); // round to nearest
        countAtPercentile = Math.max(countAtPercentile, 1); // Make sure we at least reach the first recorded entry
        long totalToCurrentIndex = 0;
        for (int i = 0; i < countsArrayLength; i++) {
            totalToCurrentIndex += getCountAtIndex(i);
            if (totalToCurrentIndex >= countAtPercentile) {
                long valueAtIndex = valueFromIndex(i);
                return (percentile == 0.0) ?
                        lowestEquivalentValue(valueAtIndex) :
                        highestEquivalentValue(valueAtIndex);
            }
        }
        return 0;
    }
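The percentile lookup described above can be sketched as follows (not from the HdrHistogram sources, assuming org.HdrHistogram on the classpath). With 1000 uniformly recorded values below the exact-representation threshold, the reported percentile values are exact:

```java
import org.HdrHistogram.Histogram;

public class PercentileDemo {
    public static void main(String[] args) {
        Histogram histogram = new Histogram(3600000000L, 3);
        for (long v = 1; v <= 1000; v++) {
            histogram.recordValue(v);
        }
        // 50% of the 1000 recordings are <= 500, 99% are <= 990:
        System.out.println(histogram.getValueAtPercentile(50.0));  // 500
        System.out.println(histogram.getValueAtPercentile(99.0));  // 990
        System.out.println(histogram.getValueAtPercentile(100.0)); // 1000
    }
}
```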

    
Get the percentile at a given value. The percentile returned is the percentile of values recorded in the histogram that are smaller than or equivalent to the given value.

Note that two values are "equivalent" in this statement if valuesAreEquivalent(long,long) would return true.

Parameters:
value The value for which to return the associated percentile
Returns:
The percentile of values recorded in the histogram that are smaller than or equivalent to the given value.
    public double getPercentileAtOrBelowValue(final long value) {
        if (getTotalCount() == 0) {
            return 100.0;
        }
        final int targetIndex = Math.min(countsArrayIndex(value), (countsArrayLength - 1));
        long totalToCurrentIndex = 0;
        for (int i = 0; i <= targetIndex; i++) {
            totalToCurrentIndex += getCountAtIndex(i);
        }
        return (100.0 * totalToCurrentIndex) / getTotalCount();
    }

    
Get the count of recorded values within a range of value levels (inclusive to within the histogram's resolution).

Parameters:
lowValue The lower value bound on the range for which to provide the recorded count. Will be rounded down with lowestEquivalentValue.
highValue The higher value bound on the range for which to provide the recorded count. Will be rounded up with highestEquivalentValue.
Returns:
the total count of values recorded in the histogram within the value range that is >= lowestEquivalentValue(lowValue) and <= highestEquivalentValue(highValue)
    public long getCountBetweenValues(final long lowValue, final long highValue) throws ArrayIndexOutOfBoundsException {
        final int lowIndex = Math.max(0, countsArrayIndex(lowValue));
        final int highIndex = Math.min(countsArrayIndex(highValue), (countsArrayLength - 1));
        long count = 0;
        for (int i = lowIndex; i <= highIndex; i++) {
            count += getCountAtIndex(i);
        }
        return count;
    }

    
Get the count of recorded values at a specific value (to within the histogram resolution at the value level).

Parameters:
value The value for which to provide the recorded count
Returns:
The total count of values recorded in the histogram within the value range that is >= lowestEquivalentValue(value) and <= highestEquivalentValue(value)
    public long getCountAtValue(final long value) throws ArrayIndexOutOfBoundsException {
        final int index = Math.min(Math.max(0, countsArrayIndex(value)), (countsArrayLength - 1));
        return getCountAtIndex(index);
    }

    
Provide a means of iterating through histogram values according to percentile levels. The iteration is performed in steps that start at 0% and reduce their distance to 100% according to the percentileTicksPerHalfDistance parameter, ultimately reaching 100% when all recorded histogram values are exhausted.

Parameters:
percentileTicksPerHalfDistance The number of iteration steps per half-distance to 100%.
Returns:
An java.lang.Iterable<HistogramIterationValue> through the histogram using a PercentileIterator
    public Percentiles percentiles(final int percentileTicksPerHalfDistance) {
        return new Percentiles(this, percentileTicksPerHalfDistance);
    }

    
Provide a means of iterating through histogram values using linear steps. The iteration is performed in steps of valueUnitsPerBucket in size, terminating when all recorded histogram values are exhausted.

Parameters:
valueUnitsPerBucket The size (in value units) of the linear buckets to use
Returns:
An java.lang.Iterable<HistogramIterationValue> through the histogram using a LinearIterator
    public LinearBucketValues linearBucketValues(final long valueUnitsPerBucket) {
        return new LinearBucketValues(this, valueUnitsPerBucket);
    }

    
Provide a means of iterating through histogram values at logarithmically increasing levels. The iteration is performed in steps that start at valueUnitsInFirstBucket and increase exponentially according to logBase, terminating when all recorded histogram values are exhausted.

Parameters:
valueUnitsInFirstBucket The size (in value units) of the first bucket in the iteration
logBase The multiplier by which bucket sizes will grow in each iteration step
Returns:
An java.lang.Iterable<HistogramIterationValue> through the histogram using a LogarithmicIterator
    public LogarithmicBucketValues logarithmicBucketValues(final long valueUnitsInFirstBucket, final double logBase) {
        return new LogarithmicBucketValues(this, valueUnitsInFirstBucket, logBase);
    }

    
Provide a means of iterating through all recorded histogram values using the finest granularity steps supported by the underlying representation. The iteration steps through all non-zero recorded value counts, and terminates when all recorded histogram values are exhausted.

Returns:
An java.lang.Iterable<HistogramIterationValue> through the histogram using a RecordedValuesIterator
    public RecordedValues recordedValues() {
        return new RecordedValues(this);
    }

    
Provide a means of iterating through all histogram values using the finest granularity steps supported by the underlying representation. The iteration steps through all possible unit value levels, regardless of whether or not there were recorded values for that value level, and terminates when all recorded histogram values are exhausted.

Returns:
An java.lang.Iterable<HistogramIterationValue> through the histogram using an AllValuesIterator
    public AllValues allValues() {
        return new AllValues(this);
    }
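The iteration facilities above all plug into Java's enhanced for loop. A minimal sketch (not from the HdrHistogram sources, assuming org.HdrHistogram on the classpath) walking the recorded values at the finest supported granularity:

```java
import org.HdrHistogram.Histogram;
import org.HdrHistogram.HistogramIterationValue;

public class IterationDemo {
    public static void main(String[] args) {
        Histogram histogram = new Histogram(3600000000L, 3);
        histogram.recordValue(100);
        histogram.recordValue(100);
        histogram.recordValue(250);

        // recordedValues() visits each non-zero count exactly once, so the
        // per-step counts must sum to the histogram's total count:
        long total = 0;
        for (HistogramIterationValue v : histogram.recordedValues()) {
            total += v.getCountAtValueIteratedTo();
            System.out.println(v.getValueIteratedTo() + " x " + v.getCountAtValueIteratedTo());
        }
        System.out.println(total == histogram.getTotalCount()); // true
    }
}
```

The percentiles(), linearBucketValues(), and logarithmicBucketValues() iterables are used the same way, differing only in how they step through the value range.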
    // Percentile iterator support:

    
An java.lang.Iterable<HistogramIterationValue> through the histogram using a PercentileIterator
    public class Percentiles implements Iterable<HistogramIterationValue> {
        final AbstractHistogram histogram;
        final int percentileTicksPerHalfDistance;
        private Percentiles(final AbstractHistogram histogram, final int percentileTicksPerHalfDistance) {
            this.histogram = histogram;
            this.percentileTicksPerHalfDistance = percentileTicksPerHalfDistance;
        }

        
        public Iterator<HistogramIterationValue> iterator() {
            return new PercentileIterator(histogram, percentileTicksPerHalfDistance);
        }
    }
    // Linear iterator support:

    
An java.lang.Iterable<HistogramIterationValue> through the histogram using a LinearIterator
    public class LinearBucketValues implements Iterable<HistogramIterationValue> {
        final AbstractHistogram histogram;
        final long valueUnitsPerBucket;
        private LinearBucketValues(final AbstractHistogram histogram, final long valueUnitsPerBucket) {
            this.histogram = histogram;
            this.valueUnitsPerBucket = valueUnitsPerBucket;
        }

        
        public Iterator<HistogramIterationValue> iterator() {
            return new LinearIterator(histogram, valueUnitsPerBucket);
        }
    }
    // Logarithmic iterator support:

    
An java.lang.Iterable<HistogramIterationValue> through the histogram using a LogarithmicIterator
    public class LogarithmicBucketValues implements Iterable<HistogramIterationValue> {
        final AbstractHistogram histogram;
        final long valueUnitsInFirstBucket;
        final double logBase;
        private LogarithmicBucketValues(final AbstractHistogram histogram,
                                        final long valueUnitsInFirstBucket, final double logBase) {
            this.histogram = histogram;
            this.valueUnitsInFirstBucket = valueUnitsInFirstBucket;
            this.logBase = logBase;
        }

        
        public Iterator<HistogramIterationValue> iterator() {
            return new LogarithmicIterator(histogram, valueUnitsInFirstBucket, logBase);
        }
    }
    // Recorded value iterator support:

    
An java.lang.Iterable<HistogramIterationValue> through the histogram using a RecordedValuesIterator
    public class RecordedValues implements Iterable<HistogramIterationValue> {
        final AbstractHistogram histogram;
        private RecordedValues(final AbstractHistogram histogram) {
            this.histogram = histogram;
        }

        
        public Iterator<HistogramIterationValue> iterator() {
            return new RecordedValuesIterator(histogram);
        }
    }
    // AllValues iterator support:

    
An java.lang.Iterable<HistogramIterationValue> through the histogram using an AllValuesIterator
    public class AllValues implements Iterable<HistogramIterationValue> {
        final AbstractHistogram histogram;
        private AllValues(final AbstractHistogram histogram) {
            this.histogram = histogram;
        }

        
        public Iterator<HistogramIterationValue> iterator() {
            return new AllValuesIterator(histogram);
        }
    }


    
Produce textual representation of the value distribution of histogram data by percentile. The distribution is output with exponentially increasing resolution, with each exponentially decreasing half-distance containing five (5) percentile reporting tick points.

Parameters:
printStream Stream into which the distribution will be output

outputValueUnitScalingRatio The scaling factor by which to divide histogram recorded values units in output
    public void outputPercentileDistribution(final PrintStream printStream,
                                             final Double outputValueUnitScalingRatio) {
        outputPercentileDistribution(printStream, 5, outputValueUnitScalingRatio);
    }
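A short sketch of producing the textual distribution (not from the HdrHistogram sources, assuming org.HdrHistogram on the classpath). Capturing the output in a buffer makes it easy to log or post-process:

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import org.HdrHistogram.Histogram;

public class OutputDemo {
    public static void main(String[] args) {
        Histogram histogram = new Histogram(3600000000L, 3);
        for (long v = 1; v <= 1000; v++) {
            histogram.recordValue(v);
        }
        // A scaling ratio of 1000.0 would, for example, report values recorded
        // in microseconds as milliseconds:
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        histogram.outputPercentileDistribution(new PrintStream(buffer), 1000.0);
        System.out.println(buffer.toString().contains("Percentile")); // true
    }
}
```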
    //
    //
    //
    // Textual percentile output support:
    //
    //
    //

    
Produce textual representation of the value distribution of histogram data by percentile. The distribution is output with exponentially increasing resolution, with each exponentially decreasing half-distance containing dumpTicksPerHalf percentile reporting tick points.

Parameters:
printStream Stream into which the distribution will be output

percentileTicksPerHalfDistance The number of reporting points per exponentially decreasing half-distance

outputValueUnitScalingRatio The scaling factor by which to divide histogram recorded values units in output
    public void outputPercentileDistribution(final PrintStream printStream,
                                             final int percentileTicksPerHalfDistance,
                                             final Double outputValueUnitScalingRatio) {
        outputPercentileDistribution(printStream, percentileTicksPerHalfDistance, outputValueUnitScalingRatio, false);
    }

    
Produce textual representation of the value distribution of histogram data by percentile. The distribution is output with exponentially increasing resolution, with each exponentially decreasing half-distance containing dumpTicksPerHalf percentile reporting tick points.

Parameters:
printStream Stream into which the distribution will be output

percentileTicksPerHalfDistance The number of reporting points per exponentially decreasing half-distance

outputValueUnitScalingRatio The scaling factor by which to divide histogram recorded values units in output
useCsvFormat Output in CSV format if true. Otherwise use plain text form.
    public void outputPercentileDistribution(final PrintStream printStream,
                                             final int percentileTicksPerHalfDistance,
                                             final Double outputValueUnitScalingRatio,
                                             final boolean useCsvFormat) {
        if (useCsvFormat) {
            printStream.format("\"Value\",\"Percentile\",\"TotalCount\",\"1/(1-Percentile)\"\n");
        } else {
            printStream.format("%12s %14s %10s %14s\n\n", "Value", "Percentile", "TotalCount", "1/(1-Percentile)");
        }
        PercentileIterator iterator = percentileIterator;
        iterator.reset(percentileTicksPerHalfDistance);
        String percentileFormatString;
        String lastLinePercentileFormatString;
        if (useCsvFormat) {
            percentileFormatString = "%." + numberOfSignificantValueDigits + "f,%.12f,%d,%.2f\n";
            lastLinePercentileFormatString = "%." + numberOfSignificantValueDigits + "f,%.12f,%d,Infinity\n";
        } else {
            percentileFormatString = "%12." + numberOfSignificantValueDigits + "f %2.12f %10d %14.2f\n";
            lastLinePercentileFormatString = "%12." + numberOfSignificantValueDigits + "f %2.12f %10d\n";
        }
        while (iterator.hasNext()) {
            HistogramIterationValue iterationValue = iterator.next();
            if (iterationValue.getPercentileLevelIteratedTo() != 100.0D) {
                printStream.format(Locale.US, percentileFormatString,
                        iterationValue.getValueIteratedTo() / outputValueUnitScalingRatio,
                        iterationValue.getPercentileLevelIteratedTo()/100.0D,
                        iterationValue.getTotalCountToThisValue(),
                        1/(1.0D - (iterationValue.getPercentileLevelIteratedTo()/100.0D)) );
            } else {
                printStream.format(Locale.US, lastLinePercentileFormatString,
                        iterationValue.getValueIteratedTo() / outputValueUnitScalingRatio,
                        iterationValue.getPercentileLevelIteratedTo()/100.0D,
                        iterationValue.getTotalCountToThisValue());
            }
        }
        if (!useCsvFormat) {
            // Calculate and output mean and std. deviation.
            // Note: mean/std. deviation numbers are very often completely irrelevant when
            // data is extremely non-normal in distribution (e.g. in cases of strong multi-modal