A discussion with Robin Glover about SharpCap and the Electronic Revolution in Amateur Astronomy

“I discovered it hiding in the software itself!”

Robin Glover regarding the origin of the EAA capabilities in his software, SharpCap, which are now transforming amateur astronomy

 

This is an edited version of a conversation I recently had with Robin Glover, the developer of SharpCap and SharpCap Pro (http://www.sharpcap.co.uk). The purpose of our discussion was to generate this write-up to share with the readers of AAA.org's Eyepiece and the Westchester Amateur Astronomers' newsletter, SkyWAAtch.

Figure 1 Robin Glover (left) and Mauri Rosenthal Skyping

Mauri: Let's start by hearing your take on the extent to which the combination of the newer CMOS cameras and your software, SharpCap, has been kind of revolutionary, making astrophotography very different today from what it was just three years ago, before those products really came on board. So tell us, do you see it as revolutionary?

Robin: I think that's a very interesting question… CMOS has changed an awful lot of things and is still changing. When I first got into astrophotography, maybe eight or nine years ago, when people were taking webcams to pieces and using them to take lunar and planetary images, CMOS was very much the poor cousin of CCD. But since then there's been this fantastic investment by companies all over the world in CMOS technology, because those cameras go into mobile phones. They go into cars that detect pedestrians in front of them, and they go into industrial processes. This vast amount of money has not really been spent for the benefit of amateur astronomers at all, but CMOS sensors have vastly improved over the ones we were looking at ten years ago. They have much less noise and are much more sensitive. Some companies have now been taking those sensors and putting them into astronomy cameras for us, and we can take advantage of all of those advances in CMOS technology. That puts us in a great place, with much lower-priced cameras than we could get a few years ago.

But what's very interesting about CMOS is that these cameras challenge some of the long-held beliefs about astrophotography. For a long time people have been getting fabulous CCD images by stacking very long sub-exposures of 10 to 15 minutes, and there's a very good reason for doing that with CCD cameras. It's become ingrained in the astrophotography mindset that you have to use those long sub-exposures, and people have perhaps forgotten the science and the reasons behind it for CCD cameras. It's tricky, really, because our intuition from ordinary photography says that outside in daylight we take a short exposure, and indoors where the lights are dim we take a longer exposure. So it seems logical that when we try to take photos of the night sky, where it's really, really dark, we take really, really long exposures. Further, when we're imaging the night sky we start to stack exposures (which you don't do when you're photographing your friend's wedding or your kid's birthday), and that changes things. So your intuition about exposure length from ordinary photography doesn't really apply to this question: if I'm going to spend an hour tonight imaging M42, would I be better off taking six ten-minute images or sixty one-minute images?

Now when you dig into the mathematics of this, you discover that the important factor in how long each sub-exposure should be is the read noise of the sensor. CCD sensors typically have a read noise of seven or eight electrons, so every time we read an image off one of those sensors, an error equivalent to seven or eight electrons of noise is added to every single pixel. In certain CMOS sensors, depending on the gain you've set, you can have read noise between one and two electrons, so that's an awful lot lower. It turns out to be important to keep the contribution of read noise down, because you pay the price of that noise for every single frame. If you took sixty one-minute frames with a CCD, you'd pay that eight-electron noise cost sixty times; with a CMOS camera the cost per frame is much smaller. When you work through the math, you discover that a roughly fourfold reduction in read noise on a CMOS sensor means you can get away with reducing your sub-exposure time by a factor of sixteen. So suddenly, instead of fifteen-minute subs over an hour of imaging, you get similar results with the CMOS sensor by taking one-minute subs. Now one-minute subs are very different from fifteen-minute subs, because you can start asking yourself questions like: how bad could my tracking be if I'm only taking a one-minute sub? What happens if an airplane crosses the frame? Cool, I only lose one sub out of sixty and not one out of four. Does it matter if I'm using an alt-az mount? The field is going to rotate a little bit during a minute, but maybe that's not going to be noticeable in a one-minute exposure. Once we've asked all of those questions, maybe we discover that by letting the software fix up slight drifts in your mount's tracking or rotation, astronomy and astrophotography suddenly become a much more accessible subject.
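
A rough sketch of where that factor of sixteen comes from, in my own notation rather than Robin's: read noise only matters when it is comparable to the shot noise from the sky background collected in each sub. If e_sky is the sky background rate in electrons per pixel per second, RN is the read noise in electrons, and k is however strongly you want the sky noise to dominate, then

```latex
% Sketch (assumed notation): read noise becomes negligible once the sky shot-noise
% variance collected in one sub exceeds the read-noise variance by a chosen factor k.
e_{\mathrm{sky}}\, t \;\gtrsim\; k\, \mathrm{RN}^2
\quad\Longrightarrow\quad
t_{\min} \;=\; \frac{k\, \mathrm{RN}^2}{e_{\mathrm{sky}}}
\;\propto\; \mathrm{RN}^2
```

So the minimum useful sub length scales with the square of the read noise: cutting the read noise from roughly 8 electrons to roughly 2 electrons, a factor of four, cuts the sub length by a factor of sixteen.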

Figure 2 Screen grab of SharpCap Pro

M: So Robin, to me it's interesting that your illustration uses one-minute exposures with CMOS cameras. I've gone to maybe a crazy extreme, which is that I'm stacking 4-second or 8-second exposures, which give me a lot of the benefits you just rattled off. I get away with terrible tracking, using fairly flimsy mounts on lightweight tripods, which contributes to my ability to do what I call ultra-portable urban astrophotography. It has dramatically transformed my work, from struggling to get five- to ten-minute subs to building ten-minute stacks in SharpCap out of literally four- or eight-second exposures. Did you have that in mind? Because in my mind that's where it's truly revolutionary.

R: Sure, there are plenty of people who use incredibly short exposures; there are fabulous pictures out there of galaxies taken by people using one-second exposures, and that's pretty much taking it to the extreme. There's a sweet spot in the middle where you're not stretching your mount too far but you're getting the best you can out of your camera. The position of that sweet spot depends on the actual sensor in your camera, on the gain you set, and on the brightness of your sky. In fact, the sweet spot is a shorter exposure for astronomers in highly light-polluted areas with a strong sky background, and a longer exposure if you are lucky enough to live out in some nice dark area.

It's actually quite complicated to calculate that accurately. This is one of the things that I've built into the recent versions of SharpCap Pro. There's a tool called the Smart Histogram, and it comes in two stages. First of all it guides you through measuring the characteristics of your sensor. For a lot of the common cameras these are built in, so you don't have to do that, but if you've got a new camera you may have to measure it, and that creates a data file on your computer that SharpCap can use every time you open your camera. It then knows the read noise and a lot of other characteristics of your camera, which means that when you open the histogram you get some guidelines as to what exposure you need. To use it to its full extent, you tell SharpCap to take a measurement of the background sky brightness at your location. Of course that can vary; when the moon comes around again you'll have a much brighter sky than on a new-moon evening. Once SharpCap knows about the sensor in your camera and about the sky brightness, it can run through all those complicated calculations that I've alluded to without you having to worry about them at all. It will say: hey, you'll get best results by setting a gain of 220 and using 16-second exposures.
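
To give a feel for the kind of calculation involved, here is a minimal sketch in Python. Everything in it is an illustrative assumption: the per-gain read-noise table, the sky rate of 5 electrons per pixel per second, and the factor of 10 used to decide when sky noise "dominates" read noise. It is not SharpCap's code; it only shows why higher gain (lower read noise) permits shorter subs, subject to whatever limit your mount imposes.

```python
# Hypothetical sensor data: read noise in electrons RMS at a few gain settings.
read_noise_at_gain = {0: 3.5, 100: 2.4, 200: 1.8, 300: 1.4}

def min_sub_length(read_noise_e, sky_e_per_sec, dominance=10.0):
    """Shortest sub where the sky shot-noise variance exceeds the read-noise
    variance by the chosen factor: sky_rate * t >= dominance * RN^2."""
    return dominance * read_noise_e ** 2 / sky_e_per_sec

def recommend(sky_e_per_sec, max_sub_s=None):
    """List (gain, sub length) options whose sub length the mount can handle."""
    options = []
    for gain, rn in sorted(read_noise_at_gain.items()):
        t = min_sub_length(rn, sky_e_per_sec)
        if max_sub_s is None or t <= max_sub_s:
            options.append((gain, round(t, 1)))
    return options

# Example: a light-polluted sky delivering ~5 electrons/pixel/second,
# with a mount that can only track well for 8-second subs.
print(recommend(sky_e_per_sec=5.0, max_sub_s=8.0))   # -> [(200, 6.5), (300, 3.9)]
```

The real calculation also has to weigh dynamic range, which falls as gain rises, which is part of what makes doing it properly "quite complicated."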

M: I might still go through all of that and decide that my tracking is really only going to allow 8-second exposures, so I'll accept the trade-off: shorter exposures at a higher gain, with a bit more noise.

R: That's exactly one of the things you can do. If you think your tracking only holds up for two to five seconds, you can put a bound on it and say, "I don't want my sub-exposures to be longer than that," and it will then find you the best combination subject to that limitation.

M: Great!

R: It's that fabulous ability to take short exposures and not worry about your tracking, and then just watch an image grow on the screen as you go. I mean, most of us are impatient people. It's hard when you go out and do astrophotography the old way, because you really don't see very much on the screen to begin with. You take all of these sub-exposures; you save them to disk; you come back indoors and stack them; and some time later you finally start to see an image. It's much more involving to see that image grow in front of you on the screen as each 4-second or 15-second exposure is added to the stack.

M: Exactly, and that brings us to the other revolution, which is EAA (Electronically Assisted Astronomy). I use this regularly at outreach in New York. SharpCap live stacking and those short exposures, with filtered optics, let me show some deep-sky objects in real time to people at outreach events. Now, I think the epitome of EAA is to get an almost instantaneous view on a screen, and my views build a little more slowly than that, but my sense is that this is something that simply couldn't have been done five years ago at all, or only in a very kludgy fashion. You've enabled people to use astrophotography equipment with just about any combination of optics and mount to dramatically extend the reach of a casual night of visual observing (now done electronically), rather than to create a processed astro-image. Was that part of what you had in mind, or is it just a side benefit that we get?

R: SharpCap has largely evolved as I've become interested in different aspects of astronomy. It all started about eight or nine years ago with webcams, trying to photograph the moon and the planets with them and realizing just how dreadful the software was that you had to use at the time. I'd be spending an evening outside with the laptop, gloves on in the cold, trying to focus on Jupiter and make the laptop do what it was supposed to, but making so many mistakes because the software was fighting against me too. So SharpCap started off with a focus on lunar and planetary imaging via webcams, and then as I've become interested in different parts of astronomy, different features have been added. About three or four years ago we started seeing EAA tools appear. Often in those days they were tied to a particular brand of camera. Some of the results being obtained were particularly impressive, and I realized that I already had a piece of software that did at least one of the hard tasks, which was talking to five or six different brands of cameras quite reliably, cameras that could easily do the longer exposures, the 5 or 10 or 20 seconds required for EAA. The only part of the problem left to solve was adding up the frames (aligning them and then adding them up), and that was not a desperately hard part of the solution. It was a light-bulb moment for me: all I've got to do is add this alignment feature and suddenly SharpCap can be an EAA tool. So it was almost as if I discovered it hiding in the software itself. It was there, almost ready to be done, and I thought, "Wow, I can do this!" It was literally two or three weeks of work over the summer of 2015 to go from no EAA to having a workable live stack in SharpCap and being outside testing it under the stars. It's been fabulously popular since then.

Figure 3 Author's image of IC1805, the Heart Nebula, using SharpCap Pro techniques discussed in this article, including 8-second exposures, dithering, and flat and dark subtraction

M: Yeah, well, I think it's remarkable. I think you're understating it. When using it at outreach I describe what's happening. I'll say that your brilliant software (people are looking at SharpCap on my laptop, which has a tablet configuration, so they see the entire interface along with the image display) is, every 4 or 8 seconds, taking an image off the camera; doing a flat subtraction; doing a dark subtraction; and then registering the stars and averaging the frame into the stack – and it's doing all of that in 1.3 seconds. I think you clearly did something extremely creative to get that on-the-fly stacking. Did you use existing algorithms, or did you come up with a completely new algorithm for such fast stacking?

R: So the stacking breaks down into three or four different stages. First of all you've got to find some stars, and I use an existing algorithm for that; that problem has been well solved and there's no point in re-writing it. Then you've got to match the pattern of stars in the stack with the pattern of stars in the latest frame that has come in, so you can work out the alignment. That one I wrote myself. There are algorithms out there to do it, but it was a nice, interesting challenge to write one myself that was definitely going to be fast enough. It's a little bit like plate solving. Then you've got to transform the frame: maybe move it 3 pixels to the left, 2 down, and rotate it 1.2 degrees. Again, this is already well solved in software and there's no need to reinvent that wheel. And then it's just adding it onto the stack, which is pretty simple once you've done the slightly harder bits of finding the stars, aligning, and transforming.
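
For readers who like to see the moving parts, here is a deliberately simplified Python sketch of those stages. It is not SharpCap's code: it handles translation only (no rotation), it matches stars naively by brightness order rather than doing the plate-solving-style pattern matching Robin describes, and the detection threshold and star count are arbitrary choices.

```python
import numpy as np
from scipy import ndimage

def find_stars(frame, sigma=5.0, max_stars=20):
    """Stage 1: return (y, x) centroids of the brightest blobs above the noise floor."""
    background, noise = np.median(frame), np.std(frame)
    labels, n = ndimage.label(frame > background + sigma * noise)
    if n == 0:
        return np.empty((0, 2))
    flux = ndimage.sum(frame, labels, index=range(1, n + 1))
    brightest = np.argsort(flux)[::-1][:max_stars] + 1          # label numbers, brightest first
    return np.array(ndimage.center_of_mass(frame, labels, index=brightest))

def estimate_shift(ref_stars, new_stars):
    """Stage 2 (naive): median offset between the two brightest-star lists."""
    k = min(len(ref_stars), len(new_stars))
    return np.median(ref_stars[:k] - new_stars[:k], axis=0)     # (dy, dx)

def live_stack(frames):
    """Stages 3-4: align each frame to the first and average it into a growing stack."""
    stack = frames[0].astype(np.float64)
    ref_stars = find_stars(stack)
    for count, frame in enumerate(frames[1:], start=2):
        shift = estimate_shift(ref_stars, find_stars(frame))
        aligned = ndimage.shift(frame.astype(np.float64), shift)   # stage 3: transform
        stack += (aligned - stack) / count                         # stage 4: running mean
        yield stack                                                # the image "grows" each sub
```

The real implementation also applies the dark and flat corrections Mauri mentions before alignment, and has to cope with frames spoiled by clouds, satellites, or too few stars.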

M: Well I am very grateful for your figuring that out because it’s given me a very cool hobby!

R: It certainly brought more targets within reach for outreach. Before, you'd have been showing the moon and any planets that happened to be visible, but for the deep-sky stuff, as you say, you'd have been constrained by the time it takes to get anything meaningful before somebody who's new to astronomy perhaps loses interest and wanders away.

M: You've already covered this in part of your answer, but I just want to confirm the background: you have a day job as a software developer, and in effect you wanted to solve these problems for yourself, and that's what led you down the path of building the software. Is that a fair summary?

R: Yeah, pretty much. I've been a software developer for many years; I have a day job where I write software in a completely different area. That software ends up being used by law firms, so it's completely different from SharpCap, but both are interesting. It was really the struggle to use the capture tools that were available several years ago. People who were into the scene then will remember things like AmCap and WXAstroCapture, which were just trickier to use. I used to go outside and try to capture the moon; I'd take one video, then press snapshot or capture again, and realize I'd just wiped over my previous file with a new one. It was little things like that that made me realize the software we had was not designed for people struggling with focusing and tracking at the same time, using a laptop in the dark. It needed to be something more foolproof.

M: Yeah, I think there's a lot of that in amateur astrophotography. I was very impressed, when I first started guiding with PHD, to read Craig Stark's description of a very similar situation: he was so tired of being eaten alive by mosquitoes that he felt the need to use his programming skills to build a more robust, easier-to-use tool to get guiding up and running on any computer with any guide camera. So there's a big parallel. I don't know if you've met him…

R: No, I’ve never met him, but we talked a couple of times over email – he’s a great guy.

M: …but I think these are two amazing stories of two guys to whom we're all very grateful for providing these tools. Let's move on to one last topic, which is the future. What do you envision as the next big things? I can give you my wish list for what I'd like to see next in SharpCap, some of which is probably impossible. Things like: can you image through clouds, please?

R: (Laughs) No! That cloud dispersal feature never works! I keep trying to write one and it never works properly either.

M: Well fix that and you’ll get a Nobel Prize.  But what do you see as the most important next steps and then I’ll bounce my ideas off you.

R: There are things that I'm definitely aiming to improve. One is that I want to have a sequencer in SharpCap, so that people who do more complicated imaging runs, perhaps with filter wheels and so on, could build a sequence: take a hundred frames, change the filter wheel to the red position, take another hundred, move the filter wheel to the blue position, and so on. This is a frequently requested feature among the power users of SharpCap, so that's one way I want to expand things.

Another is that I'm always looking into ways the image can be improved by the software. The CMOS cameras that we use now are not designed with astrophotography in mind; they're designed for industrial processes and small-format cameras. So the people who build them don't go out and take two- or four-minute exposures in the dark and then stretch the levels almost to the point of ripping them apart to see what the output looks like, because that's not what the cameras are designed for. But that's what we astrophotographers do, and so we typically expose flaws in the cameras. Some of these flaws we correct with dark subtraction, like amp glow. Another flaw that's become prevalent in some of these CMOS cameras is a slight variation in brightness between different lines in the image, creating a sort of horizontal banding. The latest version of SharpCap has a tool that helps to suppress that in the individual frames as they're captured: it looks for that sort of pattern in the image, and if you've turned the tool on it tries to wipe it out for you. So I'm looking at ways the software can find these individual flaws, and then seeing if I can write something that will correct them, hopefully without doing any other damage to the image, like reducing its sharpness or clarity, if at all possible.
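
As a rough illustration of how such a correction can work (this is only a sketch of the general idea, not SharpCap's actual algorithm), each row's offset can be estimated robustly with the median and subtracted:

```python
import numpy as np

def remove_horizontal_banding(frame):
    """Suppress row-to-row brightness offsets by subtracting each row's median
    level relative to the frame's overall background. Using the median keeps
    stars and small bright objects from biasing the per-row estimate."""
    img = frame.astype(np.float64)
    row_level = np.median(img, axis=1, keepdims=True)   # one background level per row
    overall = np.median(row_level)                       # frame-wide background level
    corrected = img - (row_level - overall)              # remove only the banding term
    return np.clip(corrected, 0, None)
```

A real tool has to be more careful than this: a bright nebula spanning whole rows would bias the row medians, which is exactly the kind of collateral damage to the image that Robin says he wants to avoid.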

Figure 4 Detail from astrophoto emphasizing “walking noise”.  Robin’s suggestion for dithering via the latest version of SharpCap Pro enabled me to eliminate this problem on my subsequent attempt.

M:  Those sound great!

 

R: Besides the horizontal banding there are other things; for example, I'd like to work out a way to remove the raining noise, or walking noise, that is common in images, where you see what looks like streaks of raindrops on a window, running diagonally.

M: My Heart Nebula image from last night is filled with that –

R: Yeah, so that's quite a common issue, caused by dark subtraction not being entirely perfect. Dithering can help with that sort of thing, and the latest SharpCap has dithering in it via PHD, but if I can find a way to deal with it purely in software, I'd love to do that. So that's another future direction where SharpCap might help.
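
For readers unfamiliar with the technique, dithering just means nudging the pointing by a few pixels in a random direction between sub-exposures; any fixed-pattern residue left after imperfect dark subtraction then lands on different sky pixels each time and averages out in the stack instead of marching across the frame. A trivial sketch of the idea follows; the five-pixel amplitude is an arbitrary choice, and SharpCap actually issues the moves through PHD rather than anything like this.

```python
import random

def dither_offsets(n_subs, max_pixels=5.0, seed=None):
    """Random (dx, dy) pixel nudges to apply to the mount between sub-exposures."""
    rng = random.Random(seed)
    return [(rng.uniform(-max_pixels, max_pixels),
             rng.uniform(-max_pixels, max_pixels)) for _ in range(n_subs)]

# Example: offsets for a 60-sub run. The stacking software re-aligns the frames,
# so the target stays put while the dark-subtraction residue gets scattered and averaged away.
print(dither_offsets(60, seed=1))
```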

Figure 5 Similar detail to Figure 4, captured using new features in SharpCap Pro to eliminate “walking” noise

M: Okay, we should wrap this up so that you can go and write code, because those sound extremely valuable! Let me just tell you the few things that I thought of. One is that people seem to love your polar alignment utility. I can never use it, because I never have a view to the north, whether I'm at home or at a lot of the outreach locations I've worked from. Is it even possible to do a polar alignment utility with only a view, say, to the south? Or, the big win, anywhere in the sky? Or do I really have to have the North Celestial Pole?

R: It's unfortunately hard to do without the view to the north. One of the reasons that SharpCap's polar alignment is successful is that it doesn't use GOTOs. A GOTO never lands bang on the position you expect; it's a few arc-seconds out, or maybe an arc-minute out, and those imperfect GOTOs on consumer-grade mounts mean that the results can be very variable. You can use the same tool two or three times in a row, without making any adjustments to your mount, and get different results, so I avoided that in SharpCap's polar alignment because I didn't like the inaccurate results. Another thing you can do is something like drift alignment, but where you move the mount in RA to speed up the drift. There are some tools that do this, and they're quite handy, and I tried some experiments with that… But the meridian flip throws that out, because you can only go ninety degrees or so before you have to flip, and on the other side of the flip you don't know that your measurements are still valid. So I have yet to find a way to make that work, although I have spent, as you can see, some time trying. I'm afraid SharpCap's polar alignment is "only if you can see the pole" at the moment, so I'm sorry about that.

M: I understand; it seems that if it were easy to do, somebody would have figured it out by now. A simple thing: for those outreach settings, a one-touch full-screen display button would be cool.

R: There's a new option in SharpCap 3.2 where you can use two monitors. If you have a second monitor plugged in (which might be less easy at outreach events), you can push the actual display of the image onto the second monitor, full screen on that monitor. All your controls for live stacking stay on your main monitor or your laptop, so you can be adjusting the settings while everybody sees a nice big view of the image. But yes, something to go full screen with the image on a single screen, I can take that on board; that makes sense.

M: Okay. Let me go for just one last question, the big future: do you think there will be a role for completely automated systems? I'm thinking that it took me a while to get up all these different learning curves: navigating the sky, learning how to track, learning how to image, learning how to process. I'm aware of a couple of ventures that aim to deliver, let's call it EAA, in a very user-friendly configuration. I don't know how much fun it will be for people to just sort of press a button on the telescope and have an image of M33 or some colorful nebula magically appear. Do you think there will be a role for almost push-button astrophotography?

R: I think that sort of astrophotography is already happening, but it's happening at exactly the opposite end of the price spectrum, among people who perhaps live somewhere like Britain, where the weather is dreadful on average, and who want to do more imaging than British night skies allow. They may rent space at an astronomy site, typically in Spain, which is quite common for people here who do that sort of thing. They put their equipment there and have remote access to a computer that runs it. But it's very much at the other end of the price range, with very expensive mounts and very expensive kit, because to make the fully push-button, automated thing you're describing work, it has to be extremely reliable. If you want the push-button experience but it keeps falling over because something isn't quite working again tonight, and it's a different thing from last night, people are going to get very frustrated. Maybe we get to the point where the price you have to pay for that reliability is too much. Buying a camera from a manufacturer who has jumped through all the hoops to make sure it never loses contact with your computer and is a hundred percent reliable is not going to be as cheap as buying from a manufacturer who just wants to ship a camera that basically works, accepting that maybe once in a night's imaging it might have a bit of a wobble and you unplug it and plug it back in again. There, quality implies cost. Sadly, I don't know if that one will fly at this end of the price range.

M: I also think of the user experience.  A lot of people who are doing this love solving the problems – it’s part of the challenge.

R: It’s actually doing it that’s interesting!

M: So I’ll leave you with this one last question:  if the weather was good tonight would you rather be taking pictures or writing code?

R: (laughing) I think I’d rather be taking pictures but often I end up writing code.

M: Ok, we all benefit from that, so I’m going to wrap this up.  Once again, thank you very much Robin!

You can view the complete version of our discussion on the AAA.org’s Astrophotography group’s YouTube channel at https://youtu.be/JcRNnS5coi8

 

Note: This article is appearing simultaneously in Eyepiece, the newsletter of the NYC-based Amateur Astronomers Association of New York, and SkyWAAtch, the newsletter of the Westchester Amateur Astronomers. I'm a member and supporter of both organizations.