State of Utah
Department of Environmental Quality
DIVISION OF SOLID AND HAZARDOUS WASTE

JON M. HUNTSMAN, JR., Governor
GARY HERBERT, Lieutenant Governor
Richard W. Sprott, Executive Director
Dennis R. Downs, Director

288 North 1460 West • PO Box 144880 • Salt Lake City, UT 84114-4880
phone (801) 538-6170 • fax (801) 538-6715 • T.D.D. (801) 536-4414 • www.deq.utah.gov

February 13, 2008
TN200800131

David P. Gosen, P.E.
Director, Environmental Services
ATK Launch Systems – Promontory
P.O. Box 707
Brigham City, UT 84302-0707

RE: Sampling Results for Emission Characterization of Open Burning Waste Propellant Materials – Draft Report
    ATK Launch Systems – Promontory Facility, EPA I.D. #UTD009081357

Dear Mr. Gosen:

The Division of Solid and Hazardous Waste has completed its review of the ODOBi test sampling results and the calculation of emission factors. The draft report was submitted to the Division on December 6, 2007. Our comments on the draft report are included with this letter.

We recommend holding a meeting to discuss the comments before ATK responds in writing or revises the draft report. To allow ATK some time to consider the comments, we request that the meeting be held during the last week of February 2008. Jeff Vandel will contact you to set up a time for the meeting. If you have any questions, please contact Jeff at (801) 538-9413.

Sincerely,

ORIGINAL DOCUMENT SIGNED BY DENNIS R. DOWNS ON 2/13/08

Dennis R. Downs, Director
Utah Division of Solid and Hazardous Waste

DRD\JV\tjm

c: George Gooch, ATK Launch Systems
   Blair Palmer, ATK Launch Systems
   Terry Brown, EPA Region 8


Comments on the "Sampling Results for Emission Characterization of Open Burning Waste Propellant Materials" – Draft Report

1. Section 4.3 – Using Non-Detects (ND) in Emission Factor Calculations. It is stated that if a compound was reported ND in all trials, it was considered ND and emission factors were not calculated. This doesn't appear to be an appropriate practice for handling all data under this scenario. For example, it is reasonable to assume that NH3 would be detected in the test sample emissions, and it was detected in the PW85-15 and PW65-35 samples. However, since it wasn't detected in any of the three runs for the PW100 sample, no emission factors were calculated for the PW100 sample. Emission factors should be calculated using the detection limit, or one-half the detection limit, for the "amount detected," as discussed prior to conducting the ODOBi tests.

2. Section 5.0 – Figure 7, Emission Factors for Particulate Matter. It seems logical that the PW85-15 test sample produced more particulate emissions than the PW100 sample, since one would expect the pure propellant sample to burn cleaner. If this assumption is correct, one would also expect the PW65-35 sample to produce more particulate emissions than the PW85-15 sample, which is the opposite of what was observed. Why would the TSP measured go up from PW100 to PW85-15, and then down from PW85-15 to PW65-35?

3. Section 2.0 – Table 4, Sampling Matrix for ATK OB Tests. Why were the sample sizes for PW85-15 and PW65-35 (1.92 lbs) so much smaller than the PW100 sample (4.7 lbs)?

4. Appendix II-L – VOCs Data. Why are the background concentrations for many of the organic compounds so much higher than the results for the three runs of the PW100 sample? Aside from analytical variability, background values should always be approximately equal to or less than the sample results. The practice of subtracting background concentrations from sample results is problematic. For example, the average background values (Table 5, Backgrounds Two and Three) for benzene and ethene, which were used to correct the concentrations of these compounds detected in the samples, are greater than the values for all three of the PW100 runs. Emission factors for these compounds were not calculated since the amounts detected were below background. Since this problem occurs a number of times for volatile organics in the PW100 sample, and for some other results (including CO in PW100 and SO2 in PW85-15), it doesn't appear to be an appropriate practice. This issue is discussed further in Comment #9 below.
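To illustrate the approach recommended in Comments 1 and 4, the following minimal Python sketch shows one way an emission factor could be computed when a compound is non-detect (substituting the full or one-half detection limit) and why subtracting a background value that exceeds the measured concentration produces a non-physical "below background" result. The function names, the simplified form of the calculation, and all numbers are illustrative assumptions only; they are not the equations or data used in the draft report.

# Illustrative sketch only: simplified emission-factor arithmetic for the issues
# raised in Comments 1 and 4. Values and the EF form are assumptions, not the
# draft report's actual equations or data.

def concentration_for_ef(measured, detection_limit, background=None,
                         nd_fraction=0.5, subtract_background=True):
    """Concentration (mg/m3) carried into the emission-factor step.

    A non-detect (measured is None) is replaced with nd_fraction times the
    detection limit (Comment 1). Background subtraction is optional; when the
    background exceeds the sample, subtracting it drives the result negative
    (Comment 4)."""
    conc = measured if measured is not None else nd_fraction * detection_limit
    if subtract_background and background is not None:
        conc -= background  # negative when background > sample concentration
    return conc

def emission_factor(conc_mg_m3, plume_volume_m3, mass_burned_lb):
    """Simplified emission factor, mg of analyte per lb of propellant burned."""
    return conc_mg_m3 * plume_volume_m3 / mass_burned_lb

# NH3 reported as ND in all three PW100 runs (Comment 1): use one-half the
# detection limit rather than dropping the compound entirely.
nh3_conc = concentration_for_ef(measured=None, detection_limit=0.10,
                                subtract_background=False)
print(emission_factor(nh3_conc, plume_volume_m3=500.0, mass_burned_lb=4.7))

# Benzene-type case (Comment 4): an averaged background larger than the run
# concentration yields a negative, "below background" value.
print(concentration_for_ef(measured=0.05, detection_limit=0.01, background=0.08))

Carrying the uncorrected (or half-detection-limit) concentration through to the emission factor avoids discarding these compounds before the risk assessment.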
5. Appendix II-B – Table 2, Sample Volumes. Are the sample volumes for the background samples so much greater than the volumes collected for the three test runs because of the longer sampling times that were used for the background samples? Please explain why longer sampling times were used for the background samples.

6. Appendix II-J – Table 2, Cl2 Data. The footnote to Table 2 states that chlorine was detected in all test samples, including background and blank samples, at similar levels; therefore, chlorine is being reported as non-detect in all samples and emission factors were not calculated. The evaluation of Cl2 in emissions can't be dropped simply because there is a problem with the data. It is expected that chlorine would be present in the emissions. Based on the data verification report, it appears that the alkali impingers were contaminated, resulting in similar levels of Cl2 detected in all samples. This data was qualified as unusable in the report. Please explain how the data can then be reported as non-detect for all samples.

7. Appendix II-K – HCN Data. Hydrogen cyanide was detected in Runs Two and Three of the PW65-35 sample. Based on the data verification report, the samples in which hydrogen cyanide was detected are likely biased low due to a problem in their preparation. What was the methodology for using qualified data when calculating emission factors? Hydrogen cyanide was detected in the PW65-35 sample but not in the PW100 or PW85-15 samples. Please explain these results.

8. Appendix II-C – SF6 Tracer Data and Dilution Factor Calculations. Run One of the PW100 test had a high dilution because of wind during the test. This high dilution resulted in non-detect values for the tracer in three of the four samples, and the NOx data was used as a surrogate to estimate the dilution factor. For the majority of analytes, Run One had slightly lower emission factors than the subsequent runs, which were conducted with a tarp to reduce dilution. Due to the uncertainty of the dilution factor for Run One and the potential for a low bias in the results, the Division recommends that Run One be rejected for the purpose of calculating emission factors. The average tracer concentration for PW100, Run One is shown in Table 2 as 4.29 ppm, while the average tracer concentrations for Runs Two and Three were approximately 3.3 ppb. It is assumed that the PW100 average tracer concentration is meant to be reported in ppb, not ppm. Please correct the units in the report.
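The units question in Comment #8 can be checked with simple arithmetic. The sketch below assumes, for illustration only, that the dilution factor is formed as the ratio of a known tracer release concentration to the tracer concentration measured at the sampler; the release value and the function name are hypothetical and are not taken from the report's SF6 methodology.

# Illustrative sketch only: why the PW100 Run One tracer value is presumably in
# ppb. The simple ratio below and the release concentration are assumptions,
# not the report's actual SF6 dilution-factor methodology.

PPB_PER_PPM = 1000.0

def dilution_factor(release_ppb, measured_ppb):
    """Dilution factor as release concentration over measured concentration."""
    return release_ppb / measured_ppb

runs_2_3_ppb = 3.3            # Table 2 average for Runs Two and Three
run_1_as_listed = 4.29        # listed as "ppm" for Run One

# Taken literally, 4.29 ppm is 4290 ppb, roughly 1300 times the later runs,
# which is inconsistent with Run One being the most dilute (untarped) test.
print(run_1_as_listed * PPB_PER_PPM / runs_2_3_ppb)

# Read as 4.29 ppb, Run One is comparable in magnitude to Runs Two and Three.
release_ppb = 1000.0          # hypothetical release concentration
print(dilution_factor(release_ppb, 4.29))          # Run One (as ppb)
print(dilution_factor(release_ppb, runs_2_3_ppb))  # Runs Two and Three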
9. Appendix II-M – CEMs Data. As discussed above in Comment #4, there is a problem with how the background data was used. When comparing the dilution-corrected CO concentrations for the three different test samples, it is inappropriate to use the average of Background Values Two and Three to calculate a "background corrected concentration" for the three runs of the PW100 sample. The concentrations of CO detected in Background Samples Two and Three were significantly higher than those detected in the PW100 sample runs, resulting in "below background" concentrations for all three runs of the PW100 CO samples. The practice of using the average background values for Runs Two and Three to correct the data from Run One shouldn't be applied to any of the ODOBi data. There is no reason to believe that any of the CEMs data should be below background when more material is being burned in the test runs. In addition, background concentrations were commonly higher than the test results for the PW85-15 and PW65-35 samples, and there are many discrepancies between the results obtained for Background Samples Two and Three. For example, ethane was detected in Background Sample Two (PW85-15) at 0.16 mg/m3, but was not detected in Background Sample Three (PW65-35). The problems with the background data measurements are summarized in the two tables below. It is proposed that background concentrations not be subtracted from the concentrations of compounds detected; emission factors for these compounds could be calculated and then evaluated in the risk assessment.

   Background Higher Than Test (test samples 100, 85:15, and 65:35):
   As; Acetonitrile; Acetonitrile; Acetylene; Acetylene; Acrylonitrile; Acrylonitrile; Carbon disulfide; Tetrahydrofuran; HCl (negative); NOx; NOx; SO2; SO2; SO2; 2,4-DNT; Carbazole; Naphthalene; Phenol; Formaldehyde; Toluene; CO; Ethane; Ethene; Benzene; Acetone; TNMOC

   Background Discrepancies:
   85:15             | 65:35
   ND                | As
   2,4-DNT           | ND
   Carbazole         | ND
   Naphthalene       | ND
   Phenol            | ND
   1,3-butadiene     | ND
   ND                | 1-butene
   ND                | acetone
   Carbon disulfide  | ND
   Ethane            | ND
   ND                | M,p-xylene
   Tetrahydrofuran   | ND

10. Appendix II-O – Dioxins/Furans-A. The standard volumes (scf) measured for PW100, Runs One through Three, are about one-half of the volumes measured for the PW85-15 and PW65-35 samples. Apparently this has resulted in the low amounts (pg) of dioxins/furans detected compared to the amounts detected in the other two samples. Please explain why the sample volumes collected for the PW100 runs were so much smaller.

11. Section 5.2 – Emission Factor Results Summary. Emission factors should be calculated separately for each chlorinated dioxin or furan congener with a USEPA toxicity equivalence factor. The different homologues have different fate and transport characteristics and should be modeled separately.
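A minimal sketch of the congener-specific calculation requested in Comment 11 follows. The congener masses, dilution factor, and the non-TCDD TEF values are placeholders for illustration; only the 2,3,7,8-TCDD reference value of 1.0 is a standard TEF, and the applicable USEPA table should be substituted in practice.

# Illustrative sketch only: per-congener emission factors plus a TEF-weighted
# TEQ roll-up, as requested in Comment 11. All masses and the non-TCDD TEFs
# are placeholders, not values from the draft report or a specific USEPA table.

congener_pg = {                 # pg of each congener collected in a run
    "2,3,7,8-TCDD": 1.2,
    "1,2,3,7,8-PeCDD": 3.4,
    "OCDD": 25.0,
}

tef = {                         # toxicity equivalence factors
    "2,3,7,8-TCDD": 1.0,        # reference congener; TEF = 1.0 by definition
    "1,2,3,7,8-PeCDD": 0.5,     # placeholder value
    "OCDD": 0.001,              # placeholder value
}

MASS_BURNED_LB = 4.7            # e.g., the PW100 charge size from Table 4
DILUTION_FACTOR = 1.0           # placeholder; use the run-specific value

# One emission factor per congener (pg per lb burned), so each congener can be
# modeled separately for fate and transport in the risk assessment.
ef_pg_per_lb = {name: pg * DILUTION_FACTOR / MASS_BURNED_LB
                for name, pg in congener_pg.items()}

# A single TEQ emission factor can still be reported as the TEF-weighted sum.
teq = sum(ef_pg_per_lb[name] * tef[name] for name in ef_pg_per_lb)

for name, ef in ef_pg_per_lb.items():
    print(f"{name}: {ef:.4f} pg/lb")
print(f"TEQ: {teq:.4f} pg TEQ/lb")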
Analytical Data Comments

1. The perchlorate method for air sampling has not been validated. The laboratory did not verify the extraction method for the filter, i.e., a filter was not spiked and extracted. The non-detect values for perchlorate conflict with the observation that perchlorate is emitted during open burning at ATK. Did the data validation include a review of the perchlorate data? The data validation appears to have been limited to the particulate data only. The report should include a discussion of the uncertainties associated with the attempted measurements of perchlorate emissions.

2. Several volatile organic compounds were listed in the QAPP but were not reported by the laboratory. Please explain why the following compounds were omitted and provide a discussion of the significance of the data gaps:
   Alpha-pinene
   Beta-pinene
   Butanal
   D-Limonene
   Naphthalene
   N-Butanol
   Vinyl Acetate

3. The hydrogen cyanide data appears unusable. The laboratory reported that the sodium hydroxide reagent was prepared incorrectly, resulting in low recoveries. Did the laboratory run the method-required quality control for NIOSH 6010 (analyze three quality control blind spikes and three analyst spikes to ensure the calibration graph and DE are in control)?

4. The laboratory indicated that the PM10/PM2.5 filter for sample DPG 9-2-1 was broken into pieces (p. 136/663 in VIII-C). Please explain why this data would be usable.

5. The blank results should not be subtracted because the method blank (J7JHA1, June 20, 2007) was not analyzed with the sample batch (6167066).

6. Per Method 7470 for mercury, "6.4 Nonaqueous samples shall be refrigerated when possible, and analyzed as soon as possible." Table 3.1 of SW-846 specifies that mercury samples must be stored at less than 6 °C. The mercury samples were received at the laboratory at ambient temperature five days after sample collection. Please explain why the mercury data should not be rejected.

7. Please investigate and explain why the ICP reporting limit standard, analyzed on June 20, 2006, does not match the reporting limits for the data (Total Metals ATK-1-BKGA-TSP-ACERNS/FILT).

8. Why is there a large discrepancy between the toluene concentration reported from the SUMMA canister and the SVOC TIC measurement of 860 µg (ATK-1-2A-SVOC/XAD/MECLRNS)?

9. The TO-11A samples were not properly preserved when received by the laboratory. The data should be qualified as potentially biased low.

10. Please describe the modifications to test methods TO-11 and TO-14 and provide the validation data for the modifications.