What I do — Spacecraft Guidance, Navigation, and Control
An introduction to this newsletter, and what I do for a living
Thank you for opening the very first post of this newsletter! I hope to provide a good mix of my own musings and interviews with smart folks. After all, I'm only a young man with not nearly enough insight to share. If you’re blindsided by me starting a newsletter, have a look at the about page.
In this inaugural post, I’ll explain what I do for a living—the discipline of guidance, navigation, and control (GNC for short). It’s actually a broader field than you’d think—one of my good friends does the same work but for underwater vehicles. I’ll start with what GNC is conceptually, and then describe how it’s done in practice.
An example is (almost) always best. Say we are engineers and we’ve built—or bought—a little four-wheel robot with its own computer and sensors. Our objective is to have the robot drive itself to the door of the room. At the risk of grossly oversimplifying, we’ve got three big problems to solve.
The first thing the robot needs to do is understand its current state. By state we mean its location, the direction it’s going, and how fast it’s moving. This is the job of navigation. In general, navigation is done by mathematically combining educated guesses of the state with external observations of it. These external observations are things like radar measurements or camera images of the robot.
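To make the blending idea concrete, here’s a minimal sketch in Python of one navigation cycle: predict where the robot should be, then nudge that guess toward a sensor measurement. The fixed `gain` is a stand-in for the weighting a real estimator (say, a Kalman filter) would compute from the uncertainties; all the names and numbers here are made up for illustration.

```python
def predict(position, velocity, dt):
    """Educated guess: where we expect the robot to be after dt seconds."""
    return position + velocity * dt

def update(predicted, measured, gain):
    """Correct the guess toward the measurement; gain in [0, 1] sets how
    much we trust the sensor relative to our own prediction."""
    return predicted + gain * (measured - predicted)

# One cycle: the robot thinks it's at 1.0 m moving at 0.5 m/s, then a
# range sensor reports 1.6 m after one second has passed.
guess = predict(position=1.0, velocity=0.5, dt=1.0)   # 1.5 m
estimate = update(guess, measured=1.6, gain=0.3)      # 1.53 m
```

The final estimate lands between the prediction and the measurement, which is the whole game: neither source is trusted completely.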
Once we have an idea of the robot’s state, the second hurdle is determining the next best move the robot should make toward its objective (getting to the door). In this simple case, you could probably imagine the best option is to go in a straight line from where the robot currently is to where the door is. This is the job of guidance. We could make the situation more complex and add obstacles between the robot and the door. Guidance must then plan a trajectory that doesn’t involve bumping into them.
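For the obstacle-free case, guidance really can be as simple as connecting the dots. A toy sketch, assuming flat 2D coordinates and evenly spaced waypoints (a real planner handling obstacles would replace this entirely):

```python
def straight_line_path(start, goal, n_waypoints):
    """Return n_waypoints (x, y) points from start to goal, inclusive."""
    sx, sy = start
    gx, gy = goal
    return [
        (sx + (gx - sx) * k / (n_waypoints - 1),
         sy + (gy - sy) * k / (n_waypoints - 1))
        for k in range(n_waypoints)
    ]

# Robot at the origin, door 4 m over and 2 m up.
path = straight_line_path(start=(0.0, 0.0), goal=(4.0, 2.0), n_waypoints=5)
# path[0] is where we are; path[-1] is the door.
```

Control then only ever has to worry about reaching the next waypoint, which is a nice separation of concerns.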
Lastly, once the robot knows where it is, and where it needs to go, it must figure out how it’s actually going to get there. This is the job of control. Control must tell the wheel motors to spin at a specific rate to move the robot in a stable and efficient way. It also commands the wheels to turn just the right amount to move in the direction that guidance tells it to go.
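The simplest flavor of this is proportional control: turn harder the further you are from the heading guidance asked for. Here’s a hedged sketch of that idea; the gain and the notion of a direct “turn rate” command are illustrative, not how any particular robot’s motors are actually driven.

```python
import math

def heading_error(desired, current):
    """Smallest signed angle (radians) from current heading to desired,
    wrapped so the robot always turns the short way around."""
    return math.atan2(math.sin(desired - current), math.cos(desired - current))

def steer_command(desired, current, gain=1.5):
    """Proportional steering: commanded turn rate scales with heading error."""
    return gain * heading_error(desired, current)

# Guidance says "head 90 degrees"; the robot is pointed at 60 degrees,
# so control commands a positive (counterclockwise) turn rate.
cmd = steer_command(math.radians(90), math.radians(60))
```

Real controllers layer on more (integral and derivative terms, motor dynamics, stability margins), but the feedback loop of measure error, command correction is the heart of it.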
Courtesy of Intuitive Machines
Put simply: Guidance tells us where we want to go, navigation tells us where we are, and control gets us from where we are to where we want to go.
The general concepts, like I said before, apply to almost any engineering system you can think of: planes, boats, submarines, spacecraft, and robots. I myself work on GNC for things like satellites and landers. I was fortunate enough to work at Intuitive Machines during the development of the IM-1 lunar lander Odysseus—pictured above. This means the kinds of sensors I work with are lasers, gyroscopes, and cameras. Now, I don’t actually handle these sensors much in person; my work is modeling them well in the software and processing the measurements they generate.
To the non-engineering readers, you may be asking, “how is this actually done?” We use mathematical models describing the physics of the system, along with models of the sensors, and implement those models in the software decision-making apparatus. Thus the GNC engineering life is much like the life of a software developer! Like my wife loves to say when people think I am a rocket scientist, “he codes and makes graphs.” It’s true.
That’s it for now—more musings and conversations incoming.
-William Fife