From s_levy@ameritech.net Sun Aug 15 18:19:52 2004
From: s_levy@ameritech.net (Stuart Levy)
Newsgroups: sci.astro.amateur
Subject: Re: night sky meter project needs SX programmer
Reply-To: slevy@new.math.uiuc.edu

In article <4106988E.CBDC990D@as.arizona.edu>, Dan McKenna wrote:

[in response to Stuart's amazement]

>Yes, no problem at 21.5 sq arc second
>the resolution is better than 0.1 Mag
>
>The current to voltage converter on the TSL237 is 2.5*10^-12 amp/Hz
>and the photodiode is about .93 by .93 mm and so it is quite sensitive.

>> Is it very sensitive to temperature? How do you calibrate a detector
>> like that, e.g. how could you make a stable dim comparison source?

>The detector has a dark frequency temperature coefficient and so the
>zero will drift. I have used temperature compensation requiring a
>second channel which seems to work. I am now using the TSL237s and
>find that it is more stable and lower noise than the TSL320R

This is exciting. Though I'm not an embedded-systems builder, I used to
build little electronic gadgets and do assembly-language programming,
and I've been thinking about how you might put together a system like
this. Probably you've already thought it through, but anyway here's
what I have in mind.

For calibration, it'd be good to have both a dark source (a shutter --
say, a black bag to hide the detector in) to measure the dark current,
and a known light source, ideally an external one, to compensate for
detector sensitivity changes (if any) and for optical misalignment,
dust, haze, etc. An isolated bright star -- Arcturus, Vega, Polaris,
maybe Aldebaran -- might make a better test source than anything you
could build. You might only need to do the bright-source calibration
occasionally, just to establish an absolute sensitivity measure so that
results from different devices could be compared.

It'd be nice, too, to include both a few-digit 7-segment LED display,
for manual readings, and a computer interface -- a serial port seems
simplest -- for easy automated measurement recording.

So in operation you'd take a dark-current measurement (put the bag over
the optics, press the Dark button), then point the device at
mostly-blank sky, and, if doing a full calibration, also point it at a
bright star near that blank sky.

It seems as though the optical field might need to be surprisingly
small to let bright stars stand out well. If I'm calculating right, a
magnitude-0 star spread over a 5-degree field would amount to only mag
21.0/square arc sec of additional brightness! (A 5-degree circular
field covers about 2.5*10^8 square arc seconds, and spreading a source
over an area dims its surface brightness by 2.5*log10 of that area --
about 21.0 magnitudes here.)

The microcontroller should provide at least *two* counter/timers: one
driven by the internal clock (ideally from an external crystal) to
provide an absolute timebase, the other clocked by the photodiode.
Done this way, the microcontroller can poll the counters at its leisure
to measure frequency, and do other things like driving the serial port
or display, rather than having to catch every transition of the
photodiode clock.

It looks as though the SX-series controllers have only one
counter/timer, so they might not be the best choice, but lots of other
inexpensive devices have more. Poking around on the web, I see Atmel's
AT89S8252 looks handy: it's cheap (~$7), runs from the same ~5V supply
the TSL237 needs, has two counters, a UART, and lots of I/O pins for
reading buttons and driving the display, includes nonvolatile memory
which the chip itself can rewrite to save calibration data, and comes
in a DIP package suitable for wirewrap prototyping.
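To make the two-counter scheme concrete, here's a rough C sketch of
the frequency arithmetic -- just the math, runnable on a PC. The
counter widths, the 921.6 kHz reference rate, and the sample numbers
are all assumptions of mine, not anything specific to the AT89S8252
(on a real part the snapshots would be SFR reads, e.g. TH0/TL0 and
TH1/TL1 on an 8051-family chip):

    #include <stdint.h>
    #include <stdio.h>

    /* Assumed reference rate clocking the timebase counter. */
    #define F_REF_HZ 921600UL

    /* Given two snapshots of each free-running 16-bit counter, recover
       the photodiode frequency.  Unsigned subtraction makes counter
       wraparound harmless, provided neither counter rolls over more
       than once between polls. */
    static uint32_t pd_freq_hz(uint16_t ref0, uint16_t ref1,
                               uint16_t pd0,  uint16_t pd1)
    {
        uint16_t dref = (uint16_t)(ref1 - ref0); /* timebase ticks elapsed */
        uint16_t dpd  = (uint16_t)(pd1  - pd0);  /* photodiode ticks elapsed */
        if (dref == 0)
            return 0;                            /* polls too close together */
        /* f_pd = (dpd / dref) * f_ref, kept in integer arithmetic */
        return (uint32_t)(((uint64_t)dpd * F_REF_HZ) / dref);
    }

    int main(void)
    {
        /* Made-up example: ~50 ms between polls gives 46080 timebase
           ticks; 213 photodiode ticks in the same window -> 4260 Hz. */
        printf("%lu Hz\n",
               (unsigned long)pd_freq_hz(1000, 47080, 500, 713));
        return 0;
    }

The point is that the main loop never has to service an interrupt for
every photodiode edge; it just snapshots both counters whenever
convenient and takes the ratio.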
The device might have about four switches on it. Besides the power
switch:

  "Dark" button: pressed for dark calibration.

  "Mode" button: shows the output in different forms, e.g. mag/sq arc
  sec, raw counts, or maybe others.

  "Calibrate" switch: changes the meaning of the two buttons.

When calibrating, you'd point at some not-very-starry sky and press
"Dark" to latch that amount of light as the "sky background". The
display would then show, instead of the mag/sq arc sec sky brightness,
the *excess* integrated brightness (in magnitudes) above that "sky
background". Point at a nearby bright star to "measure" its brightness.
If you knew the actual total brightness of the stars in the detector's
field, pressing Mode would cycle through a series of assumed brightness
values; when that reached the actual value for the star(s) in view,
holding down Mode and Dark together could save the current calibration
in nonvolatile memory.

If it worked this way, you could also use "Calibrate" to roughly
measure the detector's angular field by sweeping it slowly across a
bright, isolated star.

So, the whole package might include:

  - the magical photodiode and optics;
  - a prototype box with a 6V or 9V battery pack and a few square
    inches of perf board;
  - connectors and a 3-pin cable, so that the photodiode can be mounted
    remotely, out of view of the LED display;
  - a small 4-digit 7-segment LED display (the LTC-4727JR, for example,
    a 4-digit high-efficiency red display in a 16-pin DIP package,
    should do);
  - some way to pull up the four display anodes with a few milliamps --
    a quad buffer, a CD4016 switch, or similar;
  - a resistor pack (to limit display current);
  - the microcontroller (twelve of its I/O pins should be able to drive
    the display; with luck the ~3 mA-to-ground capacity of its I/O pins
    is enough for the display cathodes);
  - an RS-232 driver chip and a DB-9 or similar connector;
  - a few wirewrap sockets (40-pin for the microcontroller, and four
    more of 14 or 16 pins for the display, display driver, resistor
    pack, and RS-232 driver);
  - a 5V voltage regulator;
  - two slide switches ("On/Off" and "Calibrate"); and
  - two pushbuttons ("Dark" and "Mode").

I.e., not a whole lot of hardware. It sounds doable to me, and would
probably total under $100 even if the lens and filter cost $40 of that.

It's been fun thinking about this...

    Stuart
    in sunny Champaign, IL
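P.S. In case anyone wants to check my 5-degree-field arithmetic, here
it is, along with the sort of frequency-to-magnitude conversion I
imagine the firmware doing (a simple one-point calibration; the
frequencies and the calibration point below are numbers I made up, not
TSL237 data):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* How dim does a mag-0 star get, smeared over a 5-degree field? */
        const double PI = 3.14159265358979;
        double r = 2.5 * 3600.0;          /* field radius in arc seconds */
        double omega = PI * r * r;        /* field area, square arc sec */
        double dilution = 2.5 * log10(omega);
        printf("field = %.3g sq arc sec; mag-0 star reads %.1f mag/sq arc sec\n",
               omega, 0.0 + dilution);    /* prints 21.0 */

        /* Meter conversion: with dark frequency f_dark, and a one-point
           calibration where frequency f_cal is known to correspond to
           surface brightness m_cal, a sky reading at frequency f becomes: */
        double f_dark = 10.0, f_cal = 4260.0, m_cal = 10.0, f = 120.0;
        double m_sky = m_cal - 2.5 * log10((f - f_dark) / (f_cal - f_dark));
        printf("sky = %.2f mag/sq arc sec\n", m_sky);
        return 0;
    }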