Watershed Monitoring Instrumentation
Combination Multiparameter Meters

Hanna HI 9828 Handheld Multiparameter Meter

Specifications

pH
  Range: 0.00 to 14.00 pH
  Resolution: 0.01 pH
  Accuracy: ±0.02 pH

pH (mV)
  Range: ±600.0 mV
  Resolution: 0.1 mV
  Accuracy: ±0.5 mV

ORP (mV)
  Range: ±2000.0 mV
  Resolution: 0.1 mV
  Accuracy: ±1.0 mV

Dissolved Oxygen (DO)
  Range: 0.0 to 500.0%; 0.00 to 50.00 mg/L
  Resolution: 0.1%; 0.01 mg/L
  Accuracy:
    0.0 to 300.0%: ±1.5% of reading or ±1.0%, whichever is greater
    300.0 to 500.0%: ±3% of reading
    0.00 to 30.00 mg/L: ±1.5% of reading or 0.10 mg/L, whichever is greater
    30.00 to 50.00 mg/L: ±3% of reading
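The DO concentration accuracy is piecewise: below 30 mg/L a fixed ±0.10 mg/L floor applies when it exceeds the ±1.5% term, and above 30 mg/L a flat ±3% of reading takes over. A quick sketch of the resulting tolerance as a function (hypothetical helper, not part of the meter's software):

```python
def do_tolerance_mg_l(reading):
    """Return the ± tolerance (mg/L) for a DO concentration reading,
    following the piecewise accuracy spec above."""
    if reading <= 30.00:
        # ±1.5% of reading or 0.10 mg/L, whichever is greater
        return max(0.015 * reading, 0.10)
    # 30.00 to 50.00 mg/L: ±3% of reading
    return 0.03 * reading

# e.g. an 8.00 mg/L reading carries a ±0.12 mg/L tolerance,
# while a 5.00 mg/L reading is limited by the ±0.10 mg/L floor.
```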

Conductivity (EC)
  Range: 0.000 to 200.000 mS/cm (actual EC up to 400 mS/cm)
  Resolution:
    Manual: 1 μS/cm; 0.001 mS/cm; 0.01 mS/cm; 0.1 mS/cm; 1 mS/cm
    Automatic: 1 μS/cm from 0 to 9999 μS/cm; 0.01 mS/cm from 10.00 to 99.99 mS/cm; 0.1 mS/cm from 100.0 to 400.0 mS/cm
    Automatic mS/cm: 0.001 mS/cm from 0.000 to 9.999 mS/cm; 0.01 mS/cm from 10.00 to 99.99 mS/cm; 0.1 mS/cm from 100.0 to 400.0 mS/cm
  Accuracy: ±1% of reading or ±1 μS/cm, whichever is greater

Resistivity
  Range: 0 to 999999 Ω·cm; 0 to 1000.0 kΩ·cm; 0 to 1.0000 MΩ·cm
  Resolution: Dependent on resistivity reading

TDS
  Range: 0 to 400000 mg/L or ppm (the maximum value depends on the TDS factor)
  Resolution:
    Manual: 1 mg/L (ppm); 0.001 g/L (ppt); 0.01 g/L (ppt); 0.1 g/L (ppt); 1 g/L (ppt)
    Auto-range scales: 1 mg/L (ppm) from 0 to 9999 mg/L (ppm); 0.01 g/L (ppt) from 10.00 to 99.99 g/L (ppt); 0.1 g/L (ppt) from 100.0 to 400.0 g/L (ppt)
    Auto-range g/L (ppt) scales: 0.001 g/L (ppt) from 0.000 to 9.999 g/L (ppt); 0.01 g/L (ppt) from 10.00 to 99.99 g/L (ppt); 0.1 g/L (ppt) from 100.0 to 400.0 g/L (ppt)
  Accuracy: ±1% of reading or ±1 mg/L

Salinity
  Range: 0.00 to 70.00 PSU (extended Practical Salinity Scale)
  Resolution: 0.01 PSU
  Accuracy: ±2% of reading or 0.01 PSU, whichever is greater

Seawater Specific Gravity
  Range: 0.0 to 50.0 σt, σ0, σ15
  Resolution: 0.1 σt, σ0, σ15
  Accuracy: ±1 σt, σ0, σ15

Atmospheric Pressure
  Range: 450 to 850 mmHg; 17.72 to 33.46 inHg; 600.0 to 1133.2 mbar; 8.702 to 16.436 psi; 0.5921 to 1.1184 atm; 60.00 to 113.32 kPa
  Resolution: 0.1 mmHg; 0.01 inHg; 0.1 mbar; 0.001 psi; 0.0001 atm; 0.01 kPa
  Accuracy: ±3 mmHg within ±15°C of the calibration temperature
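The six pressure ranges are a single range (450 to 850 mmHg) expressed in six units. A quick conversion check using standard factors (hypothetical helper, for verifying the table rather than operating the meter):

```python
def mmhg_to(mmhg):
    """Convert a pressure in mmHg to the other five units in the spec."""
    mbar = mmhg * 1.333224          # 1 mmHg = 1.333224 mbar (hPa)
    return {
        "inHg": mmhg / 25.4,        # 25.4 mm per inch
        "mbar": mbar,
        "psi":  mmhg * 0.0193368,   # 1 mmHg = 0.0193368 psi
        "atm":  mmhg / 760.0,       # 760 mmHg per standard atmosphere
        "kPa":  mbar / 10.0,
    }

low, high = mmhg_to(450), mmhg_to(850)
# low  -> ~600.0 mbar, ~17.72 inHg, ~8.702 psi, ~0.5921 atm, ~60.00 kPa
# high -> ~1133.2 mbar, ~33.46 inHg, ~16.436 psi, ~1.1184 atm, ~113.32 kPa
```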

Temperature
  Range: -5.00 to 55.00°C; 23.00 to 131.00°F; 268.15 to 328.15 K
  Resolution: 0.01°C; 0.01°F; 0.01 K
  Accuracy: ±0.15°C; ±0.27°F; ±0.15 K

Calibration
  pH: Automatic 1, 2, or 3 points with 5 memorized standard buffers (pH 4.01, 6.86, 7.01, 9.18, 10.01) or 1 custom buffer
  mV: Automatic at 1 custom point
  Conductivity, Salinity: Automatic 1 point with 6 memorized standards (84 μS/cm, 1413 μS/cm, 5.00 mS/cm, 12.88 mS/cm, 80.0 mS/cm, 111.8 mS/cm) or 1 custom point
  DO: Automatic 1 or 2 points at 0% and 100%, or 1 custom point
  Resistivity, TDS: Based on conductivity or salinity calibration
  Atm. Pressure: Automatic at 1 custom point
  Temperature: Automatic at 1 custom point

Temperature Compensation: Automatic from -5 to 55°C (23 to 131°F)

Logging Memory: Up to 60,000 samples with 13 measurements each

Logging Interval: 1 second to 3 hours

PC Connection: USB (with HI 92000 software)

Waterproof Protection: Meter IP67; Probe IP68

Environment: 0 to 50°C (32 to 122°F); RH 100%

Power Supply: Four 1.5 V alkaline C cells (approx. 150 hours of continuous use without backlight) or four 1.2 V rechargeable C cells (approx. 70 hours of continuous use without backlight)

Dimensions: Meter: 221 x 115 x 55 mm (8.7 x 4.5 x 2.2"); Probe: 270 x 46 mm dia. (10.6 x 1.8" dia.)

Weight: Meter: 750 g (26.5 oz.); Probe: 750 g (26.5 oz.)
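The logging specifications above fix both the buffer size (60,000 samples) and the interval range (1 second to 3 hours), which together bound how long the meter can record unattended. A back-of-the-envelope check (hypothetical helper names, not part of the HI 92000 software):

```python
def log_duration_hours(interval_seconds, samples=60000):
    """Hours of logging before the 60,000-sample memory fills."""
    return samples * interval_seconds / 3600.0

fastest = log_duration_hours(1)      # 1 s interval: under 17 hours
per_minute = log_duration_hours(60)  # 1 min interval: 1000 h, about 42 days
```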


Copyright © 2001 Geo Scientific Inc., All Rights Reserved